VerbOcean - RTE Users

From ACL Wiki
<br>
{|class="wikitable sortable" cellpadding="3" cellspacing="0" border="1"
|- bgcolor="#CDCDCD"
! nowrap="nowrap"|Participants*
! nowrap="nowrap"|Campaign
! nowrap="nowrap"|Version
! nowrap="nowrap"|Specific usage description
! nowrap="nowrap"|Evaluations / Comments

|- bgcolor="#ECECEC" align="left"
| DFKI
| RTE5
|
| FIRST USE: VerbOcean relations are used to calculate relatedness between verbs in T and H.<br/>SECOND USE: used to assign relatedness between nominal predicates in T and H, after using WordNet to change the verbal nouns into verbs.
| FIRST USE: Ablation test performed. Impact of the resource: null/+0.17% accuracy on the two-way and three-way tasks, respectively, for run1; +0.33%/+0.5% for run2; +0.17%/+0.17% for run3.<br/>SECOND USE (WordNet+VerbOcean): null/+0.17% accuracy on the two-way and three-way tasks, respectively, for run1; +0.5%/+0.67% for run2; +0.17%/+0.17% for run3.

|- bgcolor="#ECECEC" align="left"
| DLSIUAES
| RTE5
|
| FIRST USE: Antonymy relation between verbs.<br/>SECOND USE: VerbOcean relations used to find correspondences between verbs.
| FIRST USE: Ablation test performed together with WordNet and DLSIUAES_negation_list. Positive impact on the two-way run: +0.66% accuracy. Negative impact on the three-way run: -1% (-0.5% for two-way derived).<br/>SECOND USE: No ablation test performed.

|- bgcolor="#ECECEC" align="left"
| FBKirst
| RTE5
|
| Extraction of 18232 entailment rules for all the English verbs connected by the "stronger-than" relation. For instance, if "kill [stronger-than] injure", then the rule "kill ENTAILS injure" is added to the rules repository (see the illustrative sketch at the bottom of the page).
| Ablation test performed. Negative impact of the resource: -0.16% accuracy on the two-way task.

|- bgcolor="#ECECEC" align="left"
| QUANTA
| RTE5
|
| The "opposite-of" relation in VerbOcean is used as a feature.
| Ablation test performed. Null impact of the resource on the two-way task.

|- bgcolor="#ECECEC" align="left"
| Siel_09
| RTE5
|
| Similarity/antonymy/unrelatedness between verbs.
| Ablation test performed. Null impact of the resource on both the two-way and the three-way task.

|- bgcolor="#ECECEC" align="left"
| UAIC
| RTE5
|
| "Opposite-of" relation to detect contradiction. Used in combination with WordNet.
| Ablation test performed (WordNet + VerbOcean). Positive impact of the two resources together: +2% accuracy on the two-way task, +1.5% on the three-way task.

|- bgcolor="#ECECEC" align="left"
| DFKI
| RTE4
| Unrefined
| Semantic relation between verbs.
| No separate evaluation.

|- bgcolor="#ECECEC" align="left"
| DLSIUAES
| RTE4
|
|
| ''Data taken from the RTE4 proceedings. Participants are recommended to add further information.''

|- bgcolor="#ECECEC" align="left"
| UAIC
| RTE4
|
|
| ''Data taken from the RTE4 proceedings. Participants are recommended to add further information.''

|- bgcolor="#ECECEC" align="left"
| UPC
| RTE4
|
|
| ''Data taken from the RTE4 proceedings. Participants are recommended to add further information.''

|- bgcolor="#ECECEC" align="left"
| VENSES
| RTE3
|
| Semantic relation between words.
| No evaluation of the resource.

|- bgcolor="#ECECEC" align="left"
| ''New user''
|
|
|
| ''Participants are encouraged to contribute.''
|}

{|class="wikitable" cellpadding="3" cellspacing="0" border="0" style="margin-left: 20px;"
|-
! align="left"|Total: 11
|}
<br>

[*] For further information about participants, click here: [[RTE Challenges - Data about participants]]

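The FBKirst entry above turns VerbOcean's "stronger-than" pairs into verb entailment rules. Below is a minimal, purely illustrative sketch of that kind of extraction; it is not the FBKirst system, and the file name <code>verbocean.unrefined.2004-05-20.txt</code> and the <code>verb1 [relation] verb2 :: score</code> line format are assumptions about the publicly released VerbOcean data.

<pre>
# Illustrative sketch only (not the FBKirst system): turn VerbOcean
# "[stronger-than]" pairs into simple "X ENTAILS Y" rules.
# Assumed input format (one relation per line), e.g.:
#   kill [stronger-than] injure :: 12.3

def extract_entailment_rules(path="verbocean.unrefined.2004-05-20.txt"):
    rules = set()
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue                              # skip blanks and comments
            left, _, _score = line.partition("::")    # drop the confidence score
            parts = left.split()
            if len(parts) != 3:
                continue                              # skip malformed lines
            verb1, relation, verb2 = parts
            if relation == "[stronger-than]":
                rules.add((verb1, verb2))             # e.g. ("kill", "injure")
    return rules

if __name__ == "__main__":
    rules = extract_entailment_rules()
    print(len(rules), "entailment rules extracted")
</pre>

A threshold on the confidence score could be added to filter out weak pairs before a rule is stored.
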
    Return to [[RTE Knowledge Resources]]
