VerbOcean - RTE Users

From ACL Wiki

Revision as of 08:33, 21 December 2009

When not otherwise specified, the data about version, usage and evaluation of the resource have been provided by participants themselves.

{|class="wikitable sortable" cellpadding="3" cellspacing="0" border="1"
|- bgcolor="#CDCDCD"
! nowrap="nowrap"|Participants*
! nowrap="nowrap"|Campaign
! nowrap="nowrap"|Version
! Specific usage description
! Evaluations / Comments
|-
|DFKI
|RTE5
|
|FIRST USE: VerbOcean relations are used to calculate relatedness between verbs in T and H.<br>SECOND USE: used to assign relatedness between nominal predicates in T and H, after using WordNet to change the verbal nouns into verbs.
|FIRST USE: Ablation test performed. Impact of the resource: null/+0.17% accuracy respectively on the two-way and three-way task for run1; +0.33%/+0.5% for run2; +0.17%/+0.17% for run3.<br>SECOND USE (WordNet+VerbOcean): null/+0.17% accuracy respectively on the two-way and three-way task for run1; +0.5%/+0.67% for run2; +0.17%/+0.17% for run3.
|-
|DLSIUAES
|RTE5
|
|FIRST USE: Antonymy relation between verbs.<br>SECOND USE: VerbOcean relations used to find correspondences between verbs.
|FIRST USE: Ablation test performed together with WordNet and DLSIUAES_negation_list. Positive impact on the two-way run: +0.66% accuracy. Negative impact on the three-way run: -1% (-0.5% for the derived two-way run).<br>SECOND USE: No ablation test performed.
|-
|FBKirst
|RTE5
|
|Extraction of 18232 entailment rules for all the English verbs connected by the "stronger-than" relation. For instance, if "kill [stronger-than] injure", then the rule "kill ENTAILS injure" is added to the rule repository.
|Ablation test performed. Negative impact of the resource: -0.16% accuracy on the two-way task.
|-
|QUANTA
|RTE5
|
|The "opposite-of" relation in VerbOcean is used as a feature.
|Ablation test performed. Null impact of the resource on the two-way task.
|-
|Siel_09
|RTE5
|
|Similarity/antonymy/unrelatedness between verbs.
|Ablation test performed. Null impact of the resource on both the two-way and the three-way task.
|-
|UAIC
|RTE5
|
|"Opposite-of" relation to detect contradiction. Used in combination with WordNet.
|Ablation test performed (WordNet + VerbOcean). Positive impact of the two resources together: +2% accuracy on the two-way task, +1.5% on the three-way task.
|-
|DFKI
|RTE4
|Unrefined
|Semantic relation between verbs.
|No separate evaluation.
|-
|DLSIUAES
|RTE4
|
|Data taken from the RTE4 proceedings. Participants are recommended to add further information.
|
|-
|UAIC
|RTE4
|
|Data taken from the RTE4 proceedings. Participants are recommended to add further information.
|
|-
|UPC
|RTE4
|
|Data taken from the RTE4 proceedings. Participants are recommended to add further information.
|
|-
|VENSES
|RTE3
|
|Semantic relation between words.
|No evaluation of the resource.
|-
|New user
|
|
|Participants are encouraged to contribute.
|
|}

{|class="wikitable" cellpadding="3" cellspacing="0" border="0" style="margin-left: 20px;"
|-
! align="left"|Total: 11
|}
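The rule extraction described for FBKirst can be sketched in a few lines. This is a minimal illustration, not the participants' actual code: the function name and the assumed input format (one relation per line, roughly <code>verb1 [relation] verb2 :: score</code>) are hypothetical and should be checked against the actual VerbOcean distribution.

```python
# Sketch of extracting entailment rules from VerbOcean "stronger-than"
# relations, as described for FBKirst above. Assumes (hypothetically)
# that each line of the VerbOcean file has the form:
#   verb1 [relation] verb2 :: score
def extract_entailment_rules(lines):
    rules = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        head, _, _score = line.partition("::")
        parts = head.split()
        # Expect exactly: verb1 [relation] verb2
        if len(parts) == 3 and parts[1] == "[stronger-than]":
            v1, v2 = parts[0], parts[2]
            # "kill [stronger-than] injure" -> "kill ENTAILS injure"
            rules.append((v1, "ENTAILS", v2))
    return rules

sample = [
    "kill [stronger-than] injure :: 8.68",
    "like [similar] love :: 11.2",
]
print(extract_entailment_rules(sample))
# -> [('kill', 'ENTAILS', 'injure')]
```

Only "stronger-than" pairs yield rules; other relations (e.g. "similar", "opposite-of") are skipped, matching the described repository of directional entailment rules.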


[*] For further information about participants, see: RTE Challenges - Data about participants
