RTE5 - Ablation Tests
Revision as of 07:21, 30 November 2009
| Ablated Resource | Team Run | Δ Accuracy % (2-way) | Δ Accuracy % (3-way) | Resource Usage Description |
|---|---|---|---|---|
| Acronym guide | Siel_093.3way | 0 | 0 | Acronym resolution |
| Acronym guide + UAIC_Acronym_rules | UAIC20091.3way | 0.17 | 0.16 | We start from the acronym guide, but additionally use a rule that treats expressions like Xaaaa Ybbbb Zcccc as the acronym XYZ, regardless of the length of the text of this form. |
| DIRT | BIU1.2way | 1.33 | | Inference rules |
| DIRT | Boeing3.3way | -1.17 | 0 | |
| DIRT | UAIC20091.3way | 0.17 | 0.33 | We transform the text and hypothesis into dependency trees with MINIPAR and use DIRT relations to map verbs in T to verbs in H |
| FrameNet | DLSIUAES1.2way | 1.16 | | Frame-to-frame similarity metric |
| FrameNet | DLSIUAES1.3way | -0.17 | -0.17 | Frame-to-frame similarity metric |
| FrameNet | UB.dmirg3.2way | 0 | | |
| Grady Ward's MOBY Thesaurus + Roget's Thesaurus | VensesTeam2.2way | 2.83 | | Semantic fields are used for semantic similarity matching in all cases of non-identical lemmas |
| MontyLingua Tool | Siel_093.3way | 0 | 0 | VerbOcean requires verbs in base form; we used the MontyLingua tool to convert verbs into their base form |
| NEGATION_rules by UAIC | UAIC20091.3way | 0 | -1.34 | Negation rules check the descending branches of verbs in the dependency trees to see whether word categories that change the meaning are present. |
| NER | UI_ccg1.2way | 4.83 | | Named entity recognition/comparison |
| PropBank | cswhu1.3way | 2 | 3.17 | Syntactic and semantic parsing |
| Stanford NER | QUANTA1.2way | 0.67 | | We use named entity similarity as a feature |
| Stopword list | FBKirst1.2way | 1.5 | -10.28 | |
| Training data from RTE1, 2, 3 | PeMoZa3.2way | 0 | | |
| Training data from RTE1, 2, 3 | PeMoZa3.2way | 0 | | |
| Training data from RTE2 | PeMoZa3.2way | 0.66 | | |
| Training data from RTE2, 3 | PeMoZa3.2way | 0 | | |
| VerbOcean | DFKI1.3way | 0 | 0.17 | |
| VerbOcean | DFKI2.3way | 0.33 | 0.5 | |
| VerbOcean | DFKI3.3way | 0.17 | 0.17 | |
| VerbOcean | FBKirst1.2way | -0.16 | -10.28 | Rules extracted from VerbOcean |
| VerbOcean | QUANTA1.2way | 0 | | We use the "opposite-of" relation in VerbOcean as a feature |
| VerbOcean | Siel_093.3way | 0 | 0 | Similarity/antonymy/unrelatedness between verbs |
| Wikipedia | BIU1.2way | -1 | | Lexical rules extracted from Wikipedia definition sentences, title parentheses, redirect and hyperlink relations |
| Wikipedia | cswhu1.3way | 1.33 | 3.34 | Lexical semantic rules |
| Wikipedia | FBKirst1.2way | 1 | | Rules extracted from Wikipedia using Latent Semantic Analysis (LSA) |
| Wikipedia | UAIC20091.3way | 1.17 | 1.5 | Relations between named entities |
| Wikipedia + NERs (LingPipe, GATE) + Perl patterns | UAIC20091.3way | 6.17 | 5 | NE module: NERs to identify persons, locations, jobs, languages, etc.; Perl patterns built by us for RTE4 to identify numbers and dates; our own resources extracted from Wikipedia to compute a "distance" between a named entity from the hypothesis and named entities from the text |
| WordNet | AUEBNLP1.3way | -2 | -2.67 | Synonyms |
| WordNet | BIU1.2way | 2.5 | | Synonyms, hyponyms (2 levels away from the original term), hyponym instances and derivations |
| WordNet | Boeing3.3way | 4 | 5.67 | |
| WordNet | DFKI1.3way | -0.17 | 0 | |
| WordNet | DFKI2.3way | 0.16 | 0.34 | |
| WordNet | DFKI3.3way | 0.17 | 0.17 | |
| WordNet | DLSIUAES1.2way | 0.83 | | Similarity between lemmata, computed by WordNet-based metrics |
| WordNet | DLSIUAES1.3way | -0.5 | -0.33 | Similarity between lemmata, computed by WordNet-based metrics |
| WordNet | JU_CSE_TAC1.2way | 0.34 | | WordNet-based unigram match |
| WordNet | PeMoZa1.2way | -0.5 | | Derivational morphology from WordNet |
| WordNet | PeMoZa1.2way | 1.33 | | Verb entailment from WordNet |
| WordNet | PeMoZa2.2way | 1 | | Derivational morphology from WordNet |
| WordNet | PeMoZa2.2way | -0.33 | | Verb entailment from WordNet |
| WordNet | QUANTA1.2way | -0.17 | | We use several relations from WordNet, such as synonyms, hyponyms, hypernyms, et al. |
| WordNet | Sagan1.3way | 0 | -0.83 | The system is based on a machine learning approach; the ablation test was obtained by training and testing with two fewer WordNet-based features. |
| WordNet | Siel_093.3way | 0.34 | -0.17 | Similarity between nouns using a WordNet tool |
| WordNet | ssl1.3way | 0 | 0.67 | WordNet analysis |
| WordNet | UB.dmirg3.2way | 0 | | |
| WordNet | UI_ccg1.2way | 4 | | Word similarity == identity |
| WordNet + FrameNet | UB.dmirg3.2way | 0 | | |
| WordNet + VerbOcean | DFKI1.3way | 0 | 0.17 | |
| WordNet + VerbOcean | DFKI2.3way | 0.5 | 0.67 | |
| WordNet + VerbOcean | DFKI3.3way | 0.17 | 0.17 | |
| WordNet + VerbOcean | UAIC20091.3way | 2 | 1.50 | Contradiction identification |
| WordNet + VerbOcean + DLSIUAES_negation_list | DLSIUAES1.2way | 0.66 | | Antonym relations between verbs (VO+WN); polarity based on negation terms (a short list we constructed ourselves) |
| WordNet + VerbOcean + DLSIUAES_negation_list | DLSIUAES1.3way | -1 | -0.5 | Antonym relations between verbs (VO+WN); polarity based on negation terms (a short list we constructed ourselves) |
| WordNet + XWordNet | UAIC20091.3way | 1 | 1.33 | Synonymy, hyponymy, hypernymy and eXtended WordNet relations |
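The UAIC acronym rule described in the table (expressions like Xaaaa Ybbbb Zcccc yield the acronym XYZ, regardless of length) can be sketched as a simple pattern match. This is a hypothetical re-implementation for illustration only; the actual UAIC_Acronym_rules resource is not reproduced here.

```python
import re

def acronym_candidates(text):
    """Find multi-word capitalized expressions (Xaaaa Ybbbb Zcccc ...)
    and propose the acronym formed from their initial capitals (XYZ),
    regardless of how many words the expression spans.
    Hypothetical sketch of the rule described in the ablation table."""
    results = {}
    # Two or more consecutive capitalized words of any count
    for match in re.finditer(r'\b(?:[A-Z][a-z]+\s+)+[A-Z][a-z]+\b', text):
        phrase = match.group(0)
        acronym = ''.join(word[0] for word in phrase.split())
        results[acronym] = phrase
    return results

print(acronym_candidates("the World Health Organization said"))
# {'WHO': 'World Health Organization'}
```

A real system would also consult the acronym guide first and only fall back to this pattern, as the table's description indicates.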
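The Δ accuracy columns report, in percentage points, how much the system's accuracy drops when the resource is removed (full run minus ablation run, so positive values mean the resource helped). Since the RTE5 test set contains 600 T-H pairs, one pair corresponds to roughly 0.17 points, which is why values like 0.17 and 0.33 recur throughout the table. A minimal sketch, assuming that convention:

```python
def ablation_delta(full_correct, ablated_correct, n_pairs=600):
    """Accuracy difference in percentage points between the complete
    system and the ablation run; n_pairs=600 matches the RTE5 test set,
    so a single pair is worth about 0.17 points."""
    return round(100.0 * (full_correct - ablated_correct) / n_pairs, 2)

# One extra correct pair out of 600:
print(ablation_delta(363, 362))  # 0.17
```

The granularity also explains why 2-way and 3-way deltas for the same ablation often differ by one or two pair-sized steps.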