SemEval Portal

From ACL Wiki
Revision as of 21:31, 8 November 2010

This page serves as a community portal for everything related to Semantic Evaluation (SemEval).

Semantic Evaluation Exercises

The purpose of the SENSEVAL and SemEval exercises is to evaluate semantic analysis systems. The first three evaluations, Senseval-1 through Senseval-3, focused on word sense disambiguation, growing each time in the number of languages offered in the tasks and in the number of participating teams. Beginning with the fourth workshop, SemEval-2007 (SemEval-1), the tasks evolved to include semantic analysis tasks beyond word sense disambiguation. This portal provides a comprehensive view of the issues involved in semantic evaluations.

Upcoming and Past Events

Event       Year               Location                 Notes
SemEval 3   to be determined   to be determined         discussion at SemEval 3 Group
SemEval 2   2010               Uppsala, Sweden          proceedings
SemEval 1   2007               Prague, Czech Republic   proceedings; copy of website at Internet Archive
SENSEVAL 3  2004               Barcelona, Spain         proceedings
SENSEVAL 2  2001               Toulouse, France         main link provides results, data, system descriptions, task descriptions, and workshop program; copy of website at Internet Archive
SENSEVAL 1  1998               East Sussex, UK          papers in Computers and the Humanities (subscribers or pay-per-view)


Tasks in Semantic Evaluation

The major tasks in semantic evaluation include:

  • Word-sense disambiguation (lexical sample)
  • Word-sense disambiguation (all-words)
  • Multi-lingual or cross-lingual word-sense disambiguation
  • Subcategorization acquisition
  • Semantic role labeling
  • Word-sense induction
  • Semantic relation identification
  • Metonymy resolution
  • Temporal information processing
  • Lexical substitution
  • Evaluation of lexical resources
  • Coreference resolution
  • Sentiment analysis

This list is expected to grow as the field progresses.

Organization

SIGLEX, the ACL Special Interest Group on the Lexicon, is the umbrella organization for the SemEval semantic evaluations and the SENSEVAL word-sense evaluation exercises. The SENSEVAL website is the home page for SENSEVAL 1-3. Each exercise is usually organized by two individuals, who make the call for tasks and handle the overall administration. Within the general guidelines, each task is then organized and run by individuals or groups.

SemEval on Wikipedia

On Wikipedia, a SemEval page has been created, and contributions and suggestions are welcome on how to improve the Wikipedia page, advocate SemEval, and further the understanding of computational semantics.

Currently, there are discussions on the possible deletion of the Wikipedia page on SemEval. Please help to verify its notability and suggest improvements to the SemEval wikipage.