VerbOcean - RTE Users
Revision as of 07:24, 10 December 2009
When not otherwise specified, the data about version, usage and evaluation of the resource have been provided by participants themselves.
| Participants* | Campaign | Version | Specific usage description | Evaluations / Comments |
|---|---|---|---|---|
| DFKI | RTE5 | | FIRST USE: VerbOcean relations are used to calculate relatedness between verbs in T and H. SECOND USE: used to assign relatedness between nominal predicates in T and H, after using WordNet to change the verbal nouns into verbs. | FIRST USE: Ablation test performed. Impact of the resource: null/+0.17% accuracy respectively on two-way and three-way task for run1; +0.33%/+0.5% for run2; +0.17%/+0.17% for run3. SECOND USE (WordNet+VerbOcean): null/+0.17% accuracy respectively on two-way and three-way task for run1; +0.5%/+0.67% for run2; +0.17%/+0.17% for run3. |
| DLSIUAES | RTE5 | | FIRST USE: Antonymy relation between verbs. SECOND USE: VerbOcean relations used to find correspondence between verbs. | FIRST USE: Ablation test performed together with WordNet and DLSIUAES_negation_list. Positive impact on two-way run: +0.66% accuracy. Negative impact on three-way run: -1% (-0.5% for two-way derived). SECOND USE: No ablation test performed. |
| FBKirst | RTE5 | | Extraction of 18232 entailment rules for all the English verbs connected by the "stronger-than" relation. For instance, if "kill [stronger-than] injure", then the rule "kill ENTAILS injure" is added to the rules repository. | Ablation test performed. Negative impact of the resource: -0.16% accuracy on two-way task. |
| QUANTA | RTE5 | | We use the "opposite-of" relation in VerbOcean as a feature. | Ablation test performed. Null impact of the resource on two-way task. |
| DFKI | RTE4 | Unrefined | Semantic relation between verbs. | No separate evaluation. |
| DLSIUAES | RTE4 | | Data taken from the RTE4 proceedings. Participants are recommended to add further information. | |
| UAIC | RTE4 | | Data taken from the RTE4 proceedings. Participants are recommended to add further information. | |
| UPC | RTE4 | | Data taken from the RTE4 proceedings. Participants are recommended to add further information. | |
| VENSES | RTE3 | | Semantic relation between words. | No evaluation of the resource. |
| New user | | | Participants are encouraged to contribute. | |

Total: 5
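The FBKirst entry above describes turning VerbOcean "stronger-than" pairs into directional entailment rules (e.g. "kill [stronger-than] injure" yields "kill ENTAILS injure"). A minimal sketch of that extraction step follows; the input line format `verb1 [relation] verb2 :: score` is an assumption for illustration and may differ from the actual VerbOcean release format.

```python
# Sketch: build an entailment-rule repository from VerbOcean-style
# "stronger-than" pairs, as described in the FBKirst row.
# NOTE: the "v1 [relation] v2 :: score" line format is assumed for
# illustration; check the VerbOcean data release for the exact format.

def extract_entailment_rules(lines):
    """Turn 'v1 [stronger-than] v2' pairs into (v1, 'ENTAILS', v2) rules."""
    rules = []
    for line in lines:
        line = line.split("::")[0].strip()  # drop any trailing score
        if "[stronger-than]" not in line:
            continue  # other relations (e.g. opposite-of) are skipped
        left, right = [p.strip() for p in line.split("[stronger-than]")]
        rules.append((left, "ENTAILS", right))
    return rules

sample = [
    "kill [stronger-than] injure :: 8.2",  # example pair from the table
    "love [opposite-of] hate :: 7.1",      # ignored: not stronger-than
]
print(extract_entailment_rules(sample))  # [('kill', 'ENTAILS', 'injure')]
```

At RTE time, each (v1, ENTAILS, v2) rule can then license matching a hypothesis verb v2 against a text verb v1, which is how such rule repositories are typically applied.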
[*] For further information about participants, click here: RTE Challenges - Data about participants
Return to RTE Knowledge Resources