DIRT Paraphrase Collection - RTE Users
{| class="wikitable"
|-
! Participants !! Campaign !! Version !! Specific usage description !! Evaluations / Comments
|-
| Boeing || RTE4 || Original DIRT db || Elaborate the T sentence with DIRT-implied entailments || Precision/recall in RTE4 (Boeing run 1): 67%/6%
|-
| BIU || RTE4 || Canonical DIRT rulebase version of Szpektor and Dagan (RANLP 2007) || Considered the top 25 rules || VerbNet + NomLex-Plus + PARC Polarity Lexicon + DIRT Paraphrase Collection = +0.9% on RTE-4 ablation tests
|-
| UAIC || RTE4 || Data taken from the RTE4 proceedings. Participants are recommended to add further information. || ||
|-
| Uoeltg || RTE4 || Data taken from the RTE4 proceedings. Participants are recommended to add further information. || ||
|-
| New user || || Participants are encouraged to contribute. || ||
|-
| colspan="5" | '''Total: 4'''
|}
<br>
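As background for the table above: the Boeing and BIU entries describe the usual way DIRT is applied in RTE systems, namely expanding the text T with variants implied by high-scoring paraphrase rules and then checking the hypothesis H against the expanded set. The Python sketch below illustrates that pattern only; the rule base, scores, and sentences are invented, and real DIRT rules are scored dependency-path templates rather than the surface patterns used here.

<source lang="python">
import re

# Hypothetical DIRT-style rules for illustration: (lhs, rhs, similarity score).
# "X" and "Y" mark the argument slots shared by both sides of the rule.
RULES = [
    ("X solves Y", "X deals with Y", 0.81),
    ("X is author of Y", "X wrote Y", 0.77),
    ("X acquired Y", "X owns Y", 0.64),
]

def apply_rules(text, rules, top_k=25):
    """Return entailed variants of `text`, keeping only the top_k
    highest-scoring rules (as in BIU's top-25 filtering)."""
    variants = []
    for lhs, rhs, score in sorted(rules, key=lambda r: -r[2])[:top_k]:
        # Turn "X is author of Y" into a regex that captures the slot fillers.
        pattern = re.escape(lhs).replace("X", "(?P<x>.+?)").replace("Y", "(?P<y>.+)")
        m = re.fullmatch(pattern, text)
        if m:
            variant = rhs.replace("X", m.group("x")).replace("Y", m.group("y"))
            variants.append((variant, score))
    return variants

# Elaborate a T sentence with rule-implied entailments, then check whether
# any variant matches H (a crude stand-in for a real alignment step).
t = "Chomsky is author of Syntactic Structures"
h = "Chomsky wrote Syntactic Structures"
expanded = apply_rules(t, RULES)
print(expanded)                               # [('Chomsky wrote Syntactic Structures', 0.77)]
print(any(variant == h for variant, _ in expanded))  # True -> evidence for entailment
</source>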
[*] For further information about participants, click here: [[RTE Challenges - Data about participants]]
Return to [[RTE Knowledge Resources]]