DIRT Paraphrase Collection - RTE Users
Revision as of 01:14, 8 April 2009
| Participants | Campaign | Version | Specific usage description | Evaluations / Comments |
|---|---|---|---|---|
| Boeing | RTE4 | Original DIRT db | Elaborate T sentence with DIRT-implied entailments | Precision/recall in RTE4: boeing run1: 67%/6%; boeing run2: 54%/30% |
| BIU | RTE4 | We used the canonical DIRT rulebase version of Szpektor and Dagan (RANLP 2007), and considered the top 25 rules. | VerbNet + NomLex-Plus + PARC Polarity Lexicon + DIRT Paraphrase Collection = +0.9% on RTE-4 ablation tests. | |
| UAIC | RTE4 | Data taken from the RTE4 proceedings. Participants are recommended to add further information. | | |
| Uoeltg | RTE4 | Data taken from the RTE4 proceedings. Participants are recommended to add further information. | | |
| New user | | | Participants are encouraged to contribute. | |

Total: 4
Return to RTE Knowledge Resources