VerbOcean - RTE Users

From ACL Wiki
Revision as of 09:22, 1 December 2009 by Celct (talk | contribs)

Unless otherwise specified, the data about the version, usage, and evaluation of the resource were provided by the participants themselves.

Participants* | Campaign | Version | Specific usage description | Evaluations / Comments

DFKI (RTE5)
  Usage:
    FIRST USE: VerbOcean relations are used to calculate relatedness between verbs in T and H.
    SECOND USE: Used to assign relatedness between nominal predicates in T and H, after using WordNet to convert the verbal nouns into verbs.
  Evaluation:
    FIRST USE: Ablation test performed. Impact of the resource: null/+0.17% accuracy on the two-way and three-way tasks respectively for run1; +0.33%/+0.5% for run2; +0.17%/+0.17% for run3.
    SECOND USE (WordNet + VerbOcean): null/+0.17% accuracy on the two-way and three-way tasks respectively for run1; +0.5%/+0.67% for run2; +0.17%/+0.17% for run3.

DLSIUAES (RTE5)
  Usage:
    FIRST USE: Antonymy relation between verbs.
    SECOND USE: VerbOcean relations used to find correspondences between verbs.
  Evaluation:
    FIRST USE: Ablation test performed together with WordNet and DLSIUAES_negation_list. Positive impact on the two-way run: +0.66% accuracy. Negative impact on the three-way run: -1% (-0.5% for the derived two-way run).
    SECOND USE: No ablation test performed.

DFKI (RTE4)
  Version: Unrefined
  Usage: Semantic relation between verbs.
  Evaluation: No separate evaluation.

DLSIUAES (RTE4)
  Data taken from the RTE4 proceedings. Participants are recommended to add further information.

UAIC (RTE4)
  Data taken from the RTE4 proceedings. Participants are recommended to add further information.

UPC (RTE4)
  Data taken from the RTE4 proceedings. Participants are recommended to add further information.

VENSES (RTE3)
  Usage: Semantic relation between words.
  Evaluation: No evaluation of the resource.

New user
  Participants are encouraged to contribute.

Total: 5
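As an illustration of the usage pattern described above (a minimal sketch, not any participant's actual system), the snippet below looks up the VerbOcean relation, if any, linking a verb from the Text (T) with a verb from the Hypothesis (H). It assumes VerbOcean's plain-text distribution format of lines shaped like `verb1 [relation] verb2 :: score`; the sample verb pairs and scores are hypothetical.

```python
# Minimal sketch of using VerbOcean relations (e.g. similarity, antonymy)
# as features over verb pairs drawn from T and H.
# Assumed input format: "verb1 [relation] verb2 :: score" per line.

def parse_verbocean(lines):
    """Map (verb1, verb2) pairs to (relation, score)."""
    relations = {}
    for line in lines:
        head, _, score = line.partition("::")
        v1, rel, v2 = head.split()
        relations[(v1, v2)] = (rel.strip("[]"), float(score))
    return relations

def verb_relation(relations, t_verb, h_verb):
    """Return the VerbOcean relation between a T verb and an H verb, if any."""
    entry = relations.get((t_verb, h_verb)) or relations.get((h_verb, t_verb))
    return entry[0] if entry else None

# Hypothetical sample lines in VerbOcean's format.
sample = [
    "buy [similar] purchase :: 12.3",
    "win [opposite-of] lose :: 9.1",
]
rels = parse_verbocean(sample)
print(verb_relation(rels, "buy", "purchase"))  # similar
print(verb_relation(rels, "lose", "win"))      # opposite-of
```

A system along these lines would typically treat a `similar`/`stronger-than` relation as evidence for entailment and an `opposite-of` relation (as in the DLSIUAES antonymy check) as evidence against it.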


[*] For further information about participants, see: RTE Challenges - Data about participants

   Return to RTE Knowledge Resources