DIRT Paraphrase Collection - RTE Users

From ACL Wiki
When not otherwise specified, the data about the version, usage, and evaluation of the resource have been provided by the participants themselves.<br>

Revision as of 06:21, 22 December 2009


{| class="wikitable sortable" cellpadding="3" cellspacing="0" border="1"
! Participants* !! Campaign !! Version !! Specific usage description !! Evaluations / Comments
|-
| BIU || RTE5 || Canonical DIRT rulebase version of Szpektor and Dagan (RANLP 2007); top 25 rules considered || || Ablation test performed. Positive impact of the resource: +1.33% accuracy on the two-way task.
|-
| Boeing || RTE5 || Verb paraphrases || || Ablation test performed. Negative impact of the resource on the two-way task: -1.17% accuracy. Null impact of the resource on the three-way task.
|-
| UAIC || RTE5 || || Use of DIRT relations to map verbs in T with verbs in H || Ablation test performed. Positive impact of the resource: +0.17% accuracy on the two-way task, +0.33% on the three-way task.
|-
| BIU || RTE4 || Canonical DIRT rulebase version of Szpektor and Dagan (RANLP 2007); top 25 rules considered || || +0.9% on RTE-4 ablation tests
|-
| Boeing || RTE4 || Original DIRT db || Elaborate T sentence with DIRT-implied entailments || Precision/recall in RTE4: run 1: 67%/6%; run 2: 54%/30%
|-
| UAIC || RTE4 || || || Data taken from the RTE4 proceedings. Participants are recommended to add further information.
|-
| Uoeltg || RTE4 || || || Data taken from the RTE4 proceedings. Participants are recommended to add further information.
|-
| UIUC || RTE3 || || Paired verb/argument patterns ||
|-
| New user || || || || Participants are encouraged to contribute.
|}

Total: 8


[*] For further information about participants, see [[RTE Challenges - Data about participants]].

Return to [[RTE Knowledge Resources]].