DIRT Paraphrase Collection - RTE Users
Unless otherwise specified, the data about the version, usage, and evaluation of the resource have been provided by the participants themselves.
| Participants* | Campaign | Version | Specific usage description | Evaluations / Comments |
|---|---|---|---|---|
| BIU | RTE5 | | We used the canonical DIRT rulebase version of Szpektor and Dagan (RANLP 2007), and considered the top 25 rules. | Ablation test performed. Positive impact of the resource: +1.33% accuracy on the two-way task. |
| Boeing | RTE5 | | Verb paraphrases | Ablation test performed. Negative impact of the resource on the two-way task: -1.17% accuracy. Null impact of the resource on the three-way task. |
| UAIC | RTE5 | | Use of DIRT relations to map verbs in T to verbs in H | Ablation test performed. Positive impact of the resource: +0.17% accuracy on the two-way task, +0.33% on the three-way task. |
| BIU | RTE4 | | We used the canonical DIRT rulebase version of Szpektor and Dagan (RANLP 2007), and considered the top 25 rules. | +0.9% on RTE-4 ablation tests |
| Boeing | RTE4 | Original DIRT db | Elaborate the T sentence with DIRT-implied entailments | Precision/recall in RTE4, Boeing run1: 67%/6%; |
| UAIC | RTE4 | | | *Data taken from the RTE4 proceedings. Participants are recommended to add further information.* |
| Uoeltg | RTE4 | | | *Data taken from the RTE4 proceedings. Participants are recommended to add further information.* |
| UIUC | RTE3 | | Paired verb/argument patterns | |
| New user | | | | *Participants are encouraged to contribute.* |
Total: 8
[*] For further information about participants, click here: RTE Challenges - Data about participants
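Several entries above (BIU, UAIC) describe the same basic technique: using DIRT inference rules, i.e. scored paraphrase pairs over dependency-path patterns such as "X acquire Y" ≈ "X buy Y", to map a verb pattern found in the Text (T) onto the verb pattern required by the Hypothesis (H), typically keeping only the highest-scoring rules (BIU reports using the top 25). The following Python sketch illustrates that idea with a toy rulebase; the rules, scores, function names, and example patterns are invented for illustration and are not taken from the DIRT collection or from any participant's system.

```python
# Toy stand-in for a DIRT rulebase: each key is a dependency-path pattern,
# each value is a list of (paraphrase pattern, confidence score) pairs.
# All entries here are invented for illustration only.
DIRT_RULES = {
    "X acquire Y": [("X buy Y", 0.91), ("X purchase Y", 0.88), ("X own Y", 0.45)],
    "X solve Y":   [("X find a solution to Y", 0.83), ("X address Y", 0.52)],
}

def top_k_paraphrases(pattern, k=25):
    """Return the k highest-scoring paraphrases of a pattern (cf. BIU's 'top 25 rules')."""
    rules = sorted(DIRT_RULES.get(pattern, []), key=lambda r: r[1], reverse=True)
    return [paraphrase for paraphrase, _ in rules[:k]]

def t_pattern_entails_h_pattern(t_pattern, h_pattern, k=25):
    """True if the pattern found in T matches the pattern in H directly or via a top-k rule."""
    return h_pattern == t_pattern or h_pattern in top_k_paraphrases(t_pattern, k)

# T contains "Google acquired YouTube"; H asks whether "Google bought YouTube".
print(t_pattern_entails_h_pattern("X acquire Y", "X buy Y"))   # True
print(t_pattern_entails_h_pattern("X acquire Y", "X sell Y"))  # False
```

A full entailment system would also extract the X/Y argument fillers from dependency parses of T and H and check that they align; the sketch only covers the pattern-to-pattern mapping step.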
Return to RTE Knowledge Resources