DIRT Paraphrase Collection - RTE Users

From ACL Wiki
Revision as of 11:32, 22 December 2009 by Celct (Talk | contribs)



Participants* | Campaign | Version / Specific usage description | Evaluations / Comments
--- | --- | --- | ---
BIU | RTE5 | Canonical DIRT rulebase version of Szpektor and Dagan (RANLP 2007); top 25 rules considered | Ablation test performed. Positive impact of the resource: +1.33% accuracy on the two-way task.
Boeing | RTE5 | Verb paraphrases | Ablation test performed. Negative impact of the resource on the two-way task: -1.17% accuracy. No impact on the three-way task.
UAIC | RTE5 | DIRT relations used to map verbs in T to verbs in H | Ablation test performed. Positive impact of the resource: +0.17% accuracy on the two-way task, +0.33% on the three-way task.
BIU | RTE4 | Canonical DIRT rulebase version of Szpektor and Dagan (RANLP 2007); top 25 rules considered | +0.9% in RTE-4 ablation tests.
Boeing | RTE4 | Original DIRT db; T sentence elaborated with DIRT-implied entailments | Precision/recall in RTE4: run 1: 67%/6%; run 2: 54%/30%.
UAIC | RTE4 | DIRT relations used to map verbs in T to verbs in H | Ablation test performed: +0.7% precision on the two-way task.
Uoeltg | RTE4 | Data taken from the RTE4 proceedings. Participants are recommended to add further information. |
UAIC | RTE3 | DIRT relations used to map verbs in T to verbs in H | Ablation test performed: +0.37% precision on the two-way task.
UIUC | RTE3 | Paired verb/argument patterns |

Total: 9
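Several of the systems above (BIU, UAIC) apply DIRT-style inference rules to map a verb in the text T to a verb in the hypothesis H, keeping only the top-scoring rules per verb. The sketch below illustrates that matching step under loudly labeled assumptions: the rule triples, scores, and the reduction of DIRT dependency-path patterns to bare verbs are all hypothetical simplifications, not the actual DIRT data format or any participant's implementation.

```python
# Illustrative sketch of DIRT-style rule lookup (hypothetical rule base and
# format; real DIRT rules pair full dependency paths, not bare verbs).

# Each rule: (lhs_verb, rhs_verb, similarity_score) -- invented example values.
RULES = [
    ("solve", "resolve", 0.81),
    ("acquire", "buy", 0.74),
    ("acquire", "purchase", 0.69),
]

def top_paraphrases(verb, rules=RULES, k=25):
    """Return up to k paraphrase verbs for `verb`, best score first
    (mirrors the 'top 25 rules' cut-off reported by BIU)."""
    matches = []
    for lhs, rhs, score in rules:
        if lhs == verb:
            matches.append((rhs, score))
        elif rhs == verb:  # similarity treated as symmetric in this sketch
            matches.append((lhs, score))
    matches.sort(key=lambda m: -m[1])
    return [v for v, _ in matches[:k]]

def verbs_map(t_verb, h_verb, rules=RULES, k=25):
    """True if the T verb equals the H verb or one of its top-k paraphrases."""
    return t_verb == h_verb or h_verb in top_paraphrases(t_verb, rules, k)
```

For example, with the toy rules above, `verbs_map("acquire", "buy")` succeeds while `verbs_map("solve", "buy")` does not, which is the kind of verb-level alignment signal the UAIC entries describe.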


[*] For further information about participants, see: RTE Challenges - Data about participants

   Return to RTE Knowledge Resources