Revision as of 08:53, 27 October 2010
This page serves as a community portal for everything related to Semantic Evaluation (SemEval).
Semantic Evaluation Exercises
The purpose of the SemEval exercises and SENSEVAL is to evaluate semantic analysis systems. The first three evaluations, Senseval-1 through Senseval-3, were focused on word sense disambiguation, each time growing in the number of languages offered in the tasks and in the number of participating teams. Beginning with the 4th workshop, SemEval-2007 (SemEval-1), the nature of the tasks evolved to include semantic analysis tasks outside of word sense disambiguation. This portal will be used to provide a comprehensive view of the issues involved in semantic evaluations.
Upcoming and Past Events
| Event | Year | Location | Notes |
|---|---|---|---|
| SemEval 3 | to be determined | to be determined | discussion at SemEval 3 Group |
| SemEval 2 | 2010 | Uppsala, Sweden | proceedings |
| SemEval 1 | 2007 | Prague, Czech Republic | proceedings; copy of website at Internet Archive |
| SENSEVAL 3 | 2004 | Barcelona, Spain | proceedings |
| SENSEVAL 2 | 2001 | Toulouse, France | main link provides results, data, system descriptions, task descriptions, and workshop program; copy of website at Internet Archive |
| SENSEVAL 1 | 1998 | East Sussex, UK | papers in Computers and the Humanities (subscription or pay-per-view) |
Tasks in Semantic Evaluation
The major tasks in semantic evaluation include:
- Word-sense disambiguation (lexical sample)
- Word-sense disambiguation (all-words)
- Multi-lingual or cross-lingual word-sense disambiguation
- Subcategorization acquisition
- Semantic role labeling
- Word-sense induction
- Semantic relation identification
- Metonymy resolution
- Temporal relation identification
- Lexical substitution
- Evaluation of lexical resources
- Coreference resolution
- Sentiment analysis
This list is expected to grow as the field progresses.
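To make the evaluation setting concrete, here is a minimal sketch (not taken from any official SemEval scorer) of how a lexical-sample word-sense disambiguation task is typically scored: each test instance has a gold sense key, and a system's predicted keys are compared against it. The instance IDs and WordNet-style sense keys below are hypothetical, for illustration only.

```python
def wsd_accuracy(gold, predicted):
    """Fraction of instances whose predicted sense key matches the gold key."""
    if not gold:
        return 0.0
    correct = sum(1 for inst, key in gold.items() if predicted.get(inst) == key)
    return correct / len(gold)

# Hypothetical instance IDs and sense keys, for illustration only.
gold = {"bank.001": "bank%1:14:00::", "bank.002": "bank%1:17:01::"}
predicted = {"bank.001": "bank%1:14:00::", "bank.002": "bank%1:14:00::"}
print(wsd_accuracy(gold, predicted))  # 0.5
```

Real SemEval tasks often report finer-grained measures (e.g. precision and recall when systems may abstain), but the comparison of system answers against gold annotations is the common core.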
Organization
SIGLEX, the ACL Special Interest Group on the Lexicon, is the umbrella organization for the SemEval semantic evaluations and the SENSEVAL word-sense evaluation exercises. The SENSEVAL home page covers SENSEVAL 1-3. Each exercise is usually organized by two individuals, who issue the call for tasks and handle the overall administration. Within the general guidelines, each task is then organized and run by individuals or groups.
SemEval on Wikipedia
On the public Wikipedia, a SemEval page has been created, and input and suggestions are welcome on how to improve it so that it advocates for, educates about, and promotes SemEval/Senseval, furthering the public's and academics' understanding of computational semantics.