RTE5 - Ablation Tests
{| class="wikitable sortable" cellpadding="3" cellspacing="0" border="1"
|- bgcolor="#CDCDCD"
! Ablated Resource
! Team Run
! Relative accuracy (2-way)
! Relative accuracy (3-way)
! Resource Usage Description
|- bgcolor="#ECECEC" align="left"
| Acronym guide
| Siel_093.3way
| style="text-align: right;"| 0
| style="text-align: right;"| 0
| Acronym Resolution
|- bgcolor="#ECECEC" align="left"
| Acronym guide + <br>Acronym_rules by UAIC
| UAIC20091.3way
| style="text-align: right;"| +0.0017
| style="text-align: right;"| +0.0016
| We start from the acronym guide, but additionally apply a rule that, for expressions like Xaaaa Ybbbb Zcccc, considers the acronym XYZ, regardless of the length of the expression (a sketch follows the table).
|- bgcolor="#ECECEC" align="left"
| DIRT
| BIU1.2way
| style="text-align: right;"| +0.0133
| style="text-align: right;"|
| Inference rules
|- bgcolor="#ECECEC" align="left"
| DIRT
| Boeing3.3way
| style="text-align: right;"| -0.0117
| style="text-align: right;"| 0
|
|- bgcolor="#ECECEC" align="left"
| DIRT
| UAIC20091.3way
| style="text-align: right;"| +0.0017
| style="text-align: right;"| +0.0033
| We transform the text and hypothesis into dependency trees with MINIPAR, then use DIRT relations to map verbs in T to verbs in H (a sketch follows the table).
|- bgcolor="#ECECEC" align="left"
| FrameNet
| DLSIUAES1.2way
| style="text-align: right;"| +0.0116
| style="text-align: right;"|
| Frame-to-frame similarity metric
|- bgcolor="#ECECEC" align="left"
| FrameNet
| DLSIUAES1.3way
| style="text-align: right;"| -0.0017
| style="text-align: right;"| -0.0017
| Frame-to-frame similarity metric
|- bgcolor="#ECECEC" align="left"
| FrameNet
| UB.dmirg3.2way
| style="text-align: right;"| 0
| style="text-align: right;"|
|
|- bgcolor="#ECECEC" align="left"
| Grady Ward’s MOBY Thesaurus + <br>Roget's Thesaurus
| VensesTeam2.2way
| style="text-align: right;"| +0.0283
| style="text-align: right;"|
| Semantic fields are used for semantic-similarity matching in all cases of non-identical lemmas (a sketch follows the table).
|}
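Three of the resource-usage descriptions above are concrete enough to illustrate. First, the UAIC acronym rule: a minimal sketch in Python, assuming a simple regular-expression notion of a capitalized-word run; the function name and pattern are ours, not UAIC's code.

<syntaxhighlight lang="python">
import re

def initial_acronyms(text):
    """Collect candidate acronyms from the initials of runs of
    capitalized words, so an expression like 'Xaaaa Ybbbb Zcccc'
    yields 'XYZ' regardless of the run's length."""
    acronyms = set()
    # A run of two or more capitalized words (a rough approximation).
    for match in re.finditer(r"(?:[A-Z][a-z]+\s+)+[A-Z][a-z]+", text):
        words = match.group(0).split()
        acronyms.add("".join(w[0] for w in words))
    return acronyms

# 'World Health Organization' in T licenses 'WHO' in H.
t = "Members of the World Health Organization met in Geneva."
print("WHO" in initial_acronyms(t))  # True
</syntaxhighlight>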
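Second, the DIRT-based mapping of verbs in T to verbs in H can be read as checking whether an inference rule connects the two dependency paths. The rule set below is toy data for illustration, not actual DIRT output, and the function name is ours.

<syntaxhighlight lang="python">
# Toy stand-ins for DIRT paraphrase rules over dependency paths
# (real DIRT rules are induced automatically from corpora).
DIRT_RULES = {
    ("X buys Y", "X acquires Y"),
    ("X writes Y", "X is the author of Y"),
}

def verbs_map(path_t, path_h):
    """True if the T path maps to the H path directly or via a rule
    (rules are treated as bidirectional here for simplicity)."""
    return (path_t == path_h
            or (path_t, path_h) in DIRT_RULES
            or (path_h, path_t) in DIRT_RULES)

print(verbs_map("X buys Y", "X acquires Y"))  # True
print(verbs_map("X buys Y", "X sells Y"))     # False
</syntaxhighlight>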
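Third, the VensesTeam row can be read as: two non-identical lemmas match when their thesaurus semantic fields overlap. A sketch under that reading, with a toy field table standing in for the MOBY and Roget data:

<syntaxhighlight lang="python">
# Toy semantic-field table standing in for the MOBY/Roget data.
SEMANTIC_FIELDS = {
    "car": {"vehicle", "transport"},
    "automobile": {"vehicle"},
    "physician": {"medicine"},
    "doctor": {"medicine"},
}

def lemmas_match(a, b):
    """Identical lemmas match; non-identical lemmas match when they
    share at least one semantic field."""
    if a == b:
        return True
    return bool(SEMANTIC_FIELDS.get(a, set()) & SEMANTIC_FIELDS.get(b, set()))

print(lemmas_match("car", "automobile"))  # True  (shared field 'vehicle')
print(lemmas_match("car", "doctor"))      # False
</syntaxhighlight>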