RTE5 - Ablation Tests
Revision as of 11:30, 25 November 2009
| Ablated Resource | Team Run | Relative accuracy (2-way) | Relative accuracy (3-way) | Resource Usage Description |
|---|---|---|---|---|
| Acronym guide | Siel_093.3way | 0 | 0 | Acronym resolution |
| Acronym guide + UAIC_Acronym_rules | UAIC20091.3way | 0.0017 | 0.0016 | We start from the acronym guide, but additionally we use a rule that, for expressions like Xaaaa Ybbbb Zcccc, considers the acronym XYZ, regardless of the length of the text with this form. |
| DIRT | BIU1.2way | 0.0133 | | Inference rules |
| DIRT | Boeing3.3way | -0.0117 | 0 | |
| DIRT | UAIC20091.3way | 0.0017 | 0.0033 | We transform the text and hypothesis into dependency trees with MINIPAR; DIRT relations are used to map verbs in T to verbs in H. |
| FrameNet | DLSIUAES1.2way | 0.0116 | | Frame-to-frame similarity metric |
| FrameNet | DLSIUAES1.3way | -0.0017 | -0.0017 | Frame-to-frame similarity metric |
| FrameNet | UB.dmirg3.2way | 0 | | |
| Grady Ward’s MOBY Thesaurus + Roget's Thesaurus | VensesTeam2.2way | 0.0283 | | Semantic fields are used for semantic similarity matching in all cases of non-identical lemmas. |
| MontyLingua Tool | Siel_093.3way | 0 | 0 | For VerbOcean, the verbs have to be in base form; we used the MontyLingua tool to convert verbs to their base form. |
| NEGATION_rules by UAIC | UAIC20091.3way | 0 | -0.0134 | Negation rules check the descending branches of verbs in the dependency trees for categories of words that change the meaning. |
| NER | UI_ccg1.2way | 0.0483 | | Named entity recognition/comparison |
| PropBank | cswhu1.3way | 0.0200 | 0.0317 | Syntactic and semantic parsing |
| Stanford NER | QUANTA1.2way | 0.0067 | | We use named entity similarity as a feature. |
| Stopword list | FBKirst1.2way | 0.0150 | -0.1028 | |
| Training data from RTE1, 2, 3 | PeMoZa3.2way | 0 | | |
| Training data from RTE1, 2, 3 | PeMoZa3.2way | 0 | | |
| Training data from RTE2 | PeMoZa3.2way | 0.0066 | | |
| Training data from RTE2, 3 | PeMoZa3.2way | 0 | | |
| VerbOcean | DFKI1.3way | 0 | 0.0017 | |
| VerbOcean | DFKI2.3way | 0.0033 | 0.0050 | |
| VerbOcean | DFKI3.3way | 0.0017 | 0.0017 | |
| VerbOcean | FBKirst1.2way | -0.0016 | -0.1028 | Rules extracted from VerbOcean |
| VerbOcean | QUANTA1.2way | 0 | | We use the "opposite-of" relation in VerbOcean as a feature. |
| VerbOcean | Siel_093.3way | 0 | 0 | Similarity/antonymy/unrelatedness between verbs |
| Wikipedia | BIU1.2way | -0.0100 | | Lexical rules extracted from Wikipedia definition sentences, title parentheses, redirect and hyperlink relations |
| Wikipedia | cswhu1.3way | 0.0133 | 0.0334 | Lexical semantic rules |
| Wikipedia | FBKirst1.2way | 0.0100 | | Rules extracted from Wikipedia using Latent Semantic Analysis (LSA) |
| Wikipedia | UAIC20091.3way | 0.0117 | 0.0150 | Relations between named entities |
| Wikipedia + NERs (LingPipe, GATE) + Perl patterns | UAIC20091.3way | 0.0617 | 0.0500 | NE module: NERs to identify Persons, Locations, Jobs, Languages, etc.; Perl patterns built by us for RTE4 to identify numbers and dates; our own resources extracted from Wikipedia to compute a "distance" between a named entity from the hypothesis and the named entities from the text |
| WordNet | AUEBNLP1.3way | -0.0200 | -0.0267 | Synonyms |
| WordNet | BIU1.2way | 0.0250 | | Synonyms, hyponyms (2 levels away from the original term), hyponym_instance and derivations |
| WordNet | Boeing3.3way | 0.0400 | 0.0567 | |
| WordNet | DFKI1.3way | -0.0017 | 0 | |
| WordNet | DFKI2.3way | 0.0016 | 0.0034 | |
| WordNet | DFKI3.3way | 0.0017 | 0.0017 | |
| WordNet | DLSIUAES1.2way | 0.0083 | | Similarity between lemmata, computed by WordNet-based metrics |
| WordNet | DLSIUAES1.3way | -0.0050 | -0.0033 | Similarity between lemmata, computed by WordNet-based metrics |
| WordNet | JU_CSE_TAC1.2way | 0.0034 | | WordNet-based unigram match |
| WordNet | PeMoZa1.2way | -0.0050 | | Derivational morphology from WordNet |
| WordNet | PeMoZa1.2way | 0.0133 | | Verb entailment from WordNet |
| WordNet | PeMoZa2.2way | 0.0100 | | Derivational morphology from WordNet |
| WordNet | PeMoZa2.2way | -0.0033 | | Verb entailment from WordNet |
| WordNet | QUANTA1.2way | -0.0017 | | We use several relations from WordNet, such as synonym, hyponym, and hypernym. |
| WordNet | Sagan1.3way | 0 | -0.0083 | The system is based on a machine learning approach; the ablation test was obtained with 2 fewer WordNet features in the training and testing steps. |
| WordNet | Siel_093.3way | 0.0034 | -0.0017 | Similarity between nouns using a WordNet tool |
| WordNet | ssl1.3way | 0 | 0.0067 | WordNet analysis |
| WordNet | UB.dmirg3.2way | 0 | | |
| WordNet | UI_ccg1.2way | 0.0400 | | Word similarity == identity |
| WordNet + FrameNet | UB.dmirg3.2way | 0 | | |
| WordNet + VerbOcean | DFKI1.3way | 0 | 0.0017 | |
| WordNet + VerbOcean | DFKI2.3way | 0.0050 | 0.0067 | |
| WordNet + VerbOcean | DFKI3.3way | 0.0017 | 0.0017 | |
| WordNet + VerbOcean | UAIC20091.3way | 0.0200 | 0.0150 | Contradiction identification |
| WordNet + VerbOcean + DLSIUAES_negation_list | DLSIUAES1.2way | 0.0066 | | Antonym relations between verbs (VO + WN); polarity based on negation terms (a short list constructed by ourselves) |
| WordNet + VerbOcean + DLSIUAES_negation_list | DLSIUAES1.3way | -0.0100 | -0.0050 | Antonym relations between verbs (VO + WN); polarity based on negation terms (a short list constructed by ourselves) |
| WordNet + XWordNet | UAIC20091.3way | 0.0100 | 0.0133 | Synonymy, hyponymy, and hypernymy via eXtended WordNet relations |
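The UAIC acronym rule in the table (for an expression like Xaaaa Ybbbb Zcccc, propose the acronym XYZ, regardless of the expression's length) can be sketched roughly as follows. This is a hypothetical reconstruction, not the team's published code; the function name and the regular expression are assumptions.

```python
import re

# Hypothetical sketch of an acronym rule in the spirit of UAIC's:
# for any run of two or more capitalized words "Xaaaa Ybbbb Zcccc",
# propose the acronym "XYZ", however many words the run contains.
CAP_RUN = re.compile(r"\b[A-Z][a-z]+(?:\s+[A-Z][a-z]+)+\b")

def acronym_candidates(text):
    """Map each proposed acronym to the expression it abbreviates."""
    candidates = {}
    for match in CAP_RUN.finditer(text):
        words = match.group(0).split()
        candidates["".join(w[0] for w in words)] = match.group(0)
    return candidates
```

For example, `acronym_candidates("World Health Organization report")` returns `{"WHO": "World Health Organization"}`; a system can then treat an acronym in the hypothesis as matching its expansion in the text.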