SemEval 2015 Task 12 - Aspect Based Sentiment Analysis

Event Notification Type: Call for Participation
Abbreviated Title: ABSA
Dates: Thursday, 4 June 2015 to Saturday, 6 June 2015
Location: Denver, Colorado, USA
Contact: Suresh Manandhar
Submission Deadline: Friday, 5 December 2014

Call For Participation
SemEval 2015 Task 12 - Aspect Based Sentiment Analysis
http://alt.qcri.org/semeval2015/task12/

TASK DESCRIPTION
****************************************************
This task (ABSA15 for short) is a continuation of SemEval 2014 Task 4 (ABSA14, http://alt.qcri.org/semeval2014/task4/). ABSA15 will focus primarily on the same domains as ABSA14 (restaurants and laptops). However, unlike ABSA14, the input datasets of ABSA15 will contain entire reviews rather than isolated (potentially out-of-context) sentences. ABSA15 also consolidates the four subtasks of ABSA14 into a unified framework. Furthermore, ABSA15 will include an out-of-domain subtask, involving test data from a domain that is unknown to the participants and different from the domains covered during training. ABSA15 consists of the following subtasks.

Subtask 1: In-domain ABSA
****************************************************
Given a review text about a laptop or a restaurant, identify the following information:

Slot 1: Aspect Category. Identify every entity (E) and attribute (A) pair (E#A) towards which an opinion is expressed in the given text. E and A should be chosen from predefined domain-specific inventories of entity types (e.g. laptop, keyboard, operating system, restaurant, food, drinks) and attribute labels (e.g. performance, design, price, quality). Each E#A pair is considered an aspect category of the given text. The inventories of entity types and attribute labels are described in the annotation guidelines; see http://alt.qcri.org/semeval2015/task12/index.php?id=data-and-tools. Some examples highlighting the required information follow:

a. It is extremely portable and easily connects to WIFI at the library and elsewhere. → {LAPTOP#PORTABILITY}, {LAPTOP#CONNECTIVITY}
b. The exotic food is beautifully presented and is a delight in delicious combinations. → {FOOD#STYLE_OPTIONS}, {FOOD#QUALITY}

Slot 2 (Only for the restaurants domain): Opinion Target Expression (OTE). An OTE is an expression used in the given text to refer to the reviewed entity E of a pair E#A. The OTE is defined by its starting and ending offsets in the given text. The OTE slot takes the value “NULL” when there is no (explicit) mention of the entity E. Below are some examples:

a. Great for a romantic evening, but over-priced. → {AMBIENCE#GENERAL, “NULL”}, {RESTAURANT#PRICES, “NULL”}
b. The fajitas were delicious, but expensive. → {FOOD#QUALITY, “fajitas”}, {FOOD#PRICES, “fajitas”}

Slot 3: Sentiment Polarity. Each identified E#A pair of the given text has to be assigned a polarity (positive, negative, or neutral). The neutral label applies to mildly positive or mildly negative sentiment, as in the second example below.

a. The applications are also very easy to find and maneuver. → {SOFTWARE#USABILITY, positive}
b. The fajitas are nothing out of the ordinary. → {FOOD#GENERAL, “fajitas”, neutral}
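Taken together, the three slots amount to one tuple per opinion: an E#A category, an optional target expression with character offsets, and a polarity. The sketch below illustrates this structure in Python; note that the class and field names are our own assumptions for illustration only — the official data format is defined in the annotation guidelines linked above.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative sketch only: the official ABSA15 data format is specified in the
# task's annotation guidelines; the names below are hypothetical.

@dataclass
class Opinion:
    category: str                 # Slot 1: E#A pair, e.g. "FOOD#QUALITY"
    polarity: str                 # Slot 3: "positive", "negative", or "neutral"
    target: Optional[str] = None  # Slot 2 (restaurants only): OTE, or None for "NULL"
    start: int = -1               # starting offset of the OTE in the text
    end: int = -1                 # ending offset of the OTE in the text

# Example b from Slot 2: "The fajitas were delicious, but expensive."
text = "The fajitas were delicious, but expensive."
i = text.find("fajitas")
opinions = [
    Opinion("FOOD#QUALITY", "positive", "fajitas", i, i + len("fajitas")),
    Opinion("FOOD#PRICES", "negative", "fajitas", i, i + len("fajitas")),
]
# The offsets recover the target expression from the review text:
assert text[opinions[0].start:opinions[0].end] == "fajitas"
```

For opinions with no explicit target mention (the “NULL” case in Slot 2), only the category and polarity would carry information.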

Subtask 2: Out-of-domain ABSA
****************************************************
The participating teams will be asked to test their systems on a previously unseen domain for which no training data will be made available. The gold annotations for Slot 1 will be provided, and the teams will be required to return annotations for Slot 3 (sentiment polarity) only.

DATASETS
****************************************************
Two datasets of ~550 reviews of laptops and restaurants annotated as above are already available for training. Additional datasets will be provided to evaluate the participating systems in Subtask 1 (in-domain ABSA). Information about the domain adaptation dataset of Subtask 2 (out-of-domain ABSA) will be provided later.

IMPORTANT DATES
****************************************************
Evaluation period starts: December 5, 2014
Evaluation period ends: December 22, 2014
Paper submission due: January 30, 2015
Paper reviews due: February 28, 2015
Camera ready due: March 30, 2015
SemEval workshop: June 4-5, 2015 (co-located with NAACL-2015 in Denver, Colorado)

MORE INFORMATION
****************************************************
The SemEval-2015 Task 12 website includes further details on the training data, evaluation, and examples of expected system outputs: http://alt.qcri.org/semeval2015/task12/

Registration at: http://alt.qcri.org/semeval2015/index.php?id=registration
Join our mailing list: https://groups.google.com/d/forum/semeval-absa
Email: semeval-absa [at] googlegroups.com

ORGANIZERS
****************************************************
Ion Androutsopoulos (Athens University of Economics and Business, Greece)
Dimitris Galanis (“Athena” Research Center, Greece)
Suresh Manandhar (University of York, UK) [Primary Contact]
Harris Papageorgiou (“Athena” Research Center, Greece)
John Pavlopoulos (Athens University of Economics and Business, Greece)
Maria Pontiki (“Athena” Research Center, Greece)
