DIRT Paraphrase Collection - RTE Users
<br />
{|class="wikitable sortable" cellpadding="3" cellspacing="0" border="1"
|- bgcolor="#CDCDCD"
! nowrap="nowrap"|Participants*
! nowrap="nowrap"|Campaign
! nowrap="nowrap"|Version
! nowrap="nowrap"|Specific usage description
! nowrap="nowrap"|Evaluations / Comments
|- bgcolor="#ECECEC" align="left"
| BIU
| RTE5
|
| We used the canonical DIRT rulebase version of Szpektor and Dagan (RANLP 2007), and considered the top 25 rules.
| Ablation test performed. Positive impact of the resource: +1.33% accuracy on the two-way task.
|- bgcolor="#ECECEC" align="left"
| Boeing
| RTE5
|
| Verb paraphrases
| Ablation test performed. Negative impact of the resource on the two-way task: -1.17% accuracy. No impact of the resource on the three-way task.
|- bgcolor="#ECECEC" align="left"
| UAIC
| RTE5
|
| Use of DIRT relations to map verbs in T to verbs in H
| Ablation test performed. Positive impact of the resource: +0.17% accuracy on the two-way task, +0.33% on the three-way task.
|- bgcolor="#ECECEC" align="left"
| BIU
| RTE4
|
| We used the canonical DIRT rulebase version of Szpektor and Dagan (RANLP 2007), and considered the top 25 rules.
| +0.9% on RTE-4 ablation tests
|- bgcolor="#ECECEC" align="left"
| Boeing
| RTE4
| Original DIRT db
| Elaborate the T sentence with DIRT-implied entailments
| Precision/recall in RTE4:<br>
Boeing run 1: 67%/6%;<br>
Boeing run 2: 54%/30%
|- bgcolor="#ECECEC" align="left"
| UAIC
| RTE4
|
| Use of DIRT relations to map verbs in T to verbs in H
| Ablation test performed: +0.7% precision on the two-way task.
|- bgcolor="#ECECEC" align="left"
| Uoeltg
| RTE4
|
|
| ''Data taken from the RTE4 proceedings. Participants are encouraged to add further information.''
|- bgcolor="#ECECEC" align="left"
| UAIC
| RTE3
|
| Use of DIRT relations to map verbs in T to verbs in H
| Ablation test performed: +0.37% precision on the two-way task.
|- bgcolor="#ECECEC" align="left"
| UIUC
| RTE3
|
| Paired verb/argument patterns
|
|}
{|class="wikitable" cellpadding="3" cellspacing="0" border="0" style="margin-left: 20px;" | {|class="wikitable" cellpadding="3" cellspacing="0" border="0" style="margin-left: 20px;" | ||
|- | |- | ||
− | ! align="left"|Total: | + | ! align="left"|Total: 9 |
|} | |} | ||
<br>

[*] For further information about participants, click here: [[RTE Challenges - Data about participants]]
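
Several of the usage descriptions in the table above (the UAIC entries and BIU's "top 25 rules" cutoff) boil down to applying DIRT inference rules to map a predicate in T onto a predicate in H. The Python sketch below is a minimal illustration of that general idea; the <code>Rule</code> class, the toy rulebase, the triple representation and the <code>verbs_map</code> function are hypothetical simplifications for illustration, not any participant's actual system, in which rules are dependency paths between argument slots ranked by DIRT similarity scores.

<pre>
# A minimal sketch, assuming a toy rulebase and a simple (subject, predicate, object)
# representation of T and H; real DIRT rules are dependency paths between argument
# slots X and Y, ranked by distributional similarity.
from dataclasses import dataclass


@dataclass
class Rule:
    lhs: str      # predicate pattern on the text (T) side, e.g. "acquire"
    rhs: str      # predicate pattern it may be rewritten to, e.g. "buy"
    score: float  # DIRT similarity score, used only for ranking here


# Hypothetical rulebase; a real system would load it from the DIRT collection.
RULES = [
    Rule("acquire", "buy", 0.31),
    Rule("acquire", "purchase", 0.28),
    Rule("solve", "find a solution to", 0.25),
]


def top_k_rules(lhs, k=25):
    """Return the k best-scoring rules for a given left-hand-side predicate
    (mirrors the 'top 25 rules' cutoff mentioned in the table above)."""
    matching = [r for r in RULES if r.lhs == lhs]
    return sorted(matching, key=lambda r: r.score, reverse=True)[:k]


def verbs_map(t_triple, h_triple, k=25):
    """Decide whether the predicate of a T triple maps onto the predicate of an
    H triple, either directly or through some top-k DIRT-style rule, with the
    X and Y argument slots aligned."""
    t_subj, t_pred, t_obj = t_triple
    h_subj, h_pred, h_obj = h_triple
    if (t_subj, t_obj) != (h_subj, h_obj):   # argument slots must match
        return False
    if t_pred == h_pred:                     # identical predicates: trivial match
        return True
    return any(r.rhs == h_pred for r in top_k_rules(t_pred, k))


# T = "Google acquired YouTube", H = "Google bought YouTube"
print(verbs_map(("Google", "acquire", "YouTube"), ("Google", "buy", "YouTube")))  # True
</pre>
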
Return to [[RTE Knowledge Resources]]