RTE5 - Ablation Tests

From ACL Wiki
Revision as of 08:40, 30 November 2009

{| class="wikitable sortable"
! Ablated Resource !! Team Run !! Δ Accuracy % (2-way) !! Δ Accuracy % (3-way) !! Resource Usage Description
|-
| Acronym guide || Siel_093.3way || style="text-align: center;"| 0 || style="text-align: center;"| 0 || Acronym resolution
|-
| Acronym guide + UAIC_Acronym_rules || UAIC20091.3way || style="text-align: center;"| 0.17 || style="text-align: center;"| 0.16 || We start from the acronym guide, but additionally we use a rule that, for expressions like Xaaaa Ybbbb Zcccc, considers the acronym XYZ, regardless of the length of the text with this form
|-
| DIRT || BIU1.2way || style="text-align: center;"| 1.33 || style="text-align: center;"| — || Inference rules
|-
| DIRT || Boeing3.3way || style="text-align: center;"| -1.17 || style="text-align: center;"| 0 ||
|-
| DIRT || UAIC20091.3way || style="text-align: center;"| 0.17 || style="text-align: center;"| 0.33 || We transform text and hypothesis into dependency trees with MINIPAR; DIRT relations are used to map verbs in T to verbs in H
|-
| FrameNet || DLSIUAES1.2way || style="text-align: center;"| 1.16 || style="text-align: center;"| — || Frame-to-frame similarity metric
|-
| FrameNet || DLSIUAES1.3way || style="text-align: center;"| -0.17 || style="text-align: center;"| -0.17 || Frame-to-frame similarity metric
|-
| FrameNet || UB.dmirg3.2way || style="text-align: center;"| 0 || style="text-align: center;"| — ||
|-
| Grady Ward's MOBY Thesaurus + Roget's Thesaurus || VensesTeam2.2way || style="text-align: center;"| 2.83 || style="text-align: center;"| — || Semantic fields are used for semantic similarity matching in all cases of non-identical lemmas
|-
| MontyLingua tool || Siel_093.3way || style="text-align: center;"| 0 || style="text-align: center;"| 0 || For VerbOcean, the verbs have to be in base form; we used the "MontyLingua" tool to convert the verbs into their base form
|-
| NEGATION_rules by UAIC || UAIC20091.3way || style="text-align: center;"| 0 || style="text-align: center;"| -1.34 || Negation rules check the descending branches of verb nodes in the dependency trees to see whether words from categories that change the meaning are present
|-
| NER || UI_ccg1.2way || style="text-align: center;"| 4.83 || style="text-align: center;"| — || Named entity recognition/comparison
|-
| PropBank || cswhu1.3way || style="text-align: center;"| 2 || style="text-align: center;"| 3.17 || Syntactic and semantic parsing
|-
| Stanford NER || QUANTA1.2way || style="text-align: center;"| 0.67 || style="text-align: center;"| — || We use named entity similarity as a feature
|-
| Stopword list || FBKirst1.2way || style="text-align: center;"| 1.5 || style="text-align: center;"| -10.28 ||
|-
| Training data from RTE1, 2, 3 || PeMoZa3.2way || style="text-align: center;"| 0 || style="text-align: center;"| — ||
|-
| Training data from RTE1, 2, 3 || PeMoZa3.2way || style="text-align: center;"| 0 || style="text-align: center;"| — ||
|-
| Training data from RTE2 || PeMoZa3.2way || style="text-align: center;"| 0.66 || style="text-align: center;"| — ||
|-
| Training data from RTE2, 3 || PeMoZa3.2way || style="text-align: center;"| 0 || style="text-align: center;"| — ||
|-
| VerbOcean || DFKI1.3way || style="text-align: center;"| 0 || style="text-align: center;"| 0.17 ||
|-
| VerbOcean || DFKI2.3way || style="text-align: center;"| 0.33 || style="text-align: center;"| 0.5 ||
|-
| VerbOcean || DFKI3.3way || style="text-align: center;"| 0.17 || style="text-align: center;"| 0.17 ||
|-
| VerbOcean || FBKirst1.2way || style="text-align: center;"| -0.16 || style="text-align: center;"| -10.28 || Rules extracted from VerbOcean
|-
| VerbOcean || QUANTA1.2way || style="text-align: center;"| 0 || style="text-align: center;"| — || We use the "opposite-of" relation in VerbOcean as a feature
|-
| VerbOcean || Siel_093.3way || style="text-align: center;"| 0 || style="text-align: center;"| 0 || Similarity/antonymy/unrelatedness between verbs
|-
| Wikipedia || BIU1.2way || style="text-align: center;"| -1 || style="text-align: center;"| — || Lexical rules extracted from Wikipedia definition sentences, title parentheses, redirect and hyperlink relations
|-
| Wikipedia || cswhu1.3way || style="text-align: center;"| 1.33 || style="text-align: center;"| 3.34 || Lexical semantic rules
|-
| Wikipedia || FBKirst1.2way || style="text-align: center;"| 1 || style="text-align: center;"| — || Rules extracted from Wikipedia using Latent Semantic Analysis (LSA)
|-
| Wikipedia || UAIC20091.3way || style="text-align: center;"| 1.17 || style="text-align: center;"| 1.5 || Relations between named entities
|-
| Wikipedia + NERs (LingPipe, GATE) + Perl patterns || UAIC20091.3way || style="text-align: center;"| 6.17 || style="text-align: center;"| 5 || NE module: NERs to identify persons, locations, jobs, languages, etc.; Perl patterns built by us for RTE4 to identify numbers and dates; our own resources extracted from Wikipedia to identify a "distance" between a named entity from the hypothesis and named entities from the text
|-
| WordNet || AUEBNLP1.3way || style="text-align: center;"| -2 || style="text-align: center;"| -2.67 || Synonyms
|-
| WordNet || BIU1.2way || style="text-align: center;"| 2.5 || style="text-align: center;"| — || Synonyms, hyponyms (2 levels away from the original term), hyponym_instance and derivations
|-
| WordNet || Boeing3.3way || style="text-align: center;"| 4 || style="text-align: center;"| 5.67 ||
|-
| WordNet || DFKI1.3way || style="text-align: center;"| -0.17 || style="text-align: center;"| 0 ||
|-
| WordNet || DFKI2.3way || style="text-align: center;"| 0.16 || style="text-align: center;"| 0.34 ||
|-
| WordNet || DFKI3.3way || style="text-align: center;"| 0.17 || style="text-align: center;"| 0.17 ||
|-
| WordNet || DLSIUAES1.2way || style="text-align: center;"| 0.83 || style="text-align: center;"| — || Similarity between lemmata, computed by WordNet-based metrics
|-
| WordNet || DLSIUAES1.3way || style="text-align: center;"| -0.5 || style="text-align: center;"| -0.33 || Similarity between lemmata, computed by WordNet-based metrics
|-
| WordNet || JU_CSE_TAC1.2way || style="text-align: center;"| 0.34 || style="text-align: center;"| — || WordNet-based unigram match
|-
| WordNet || PeMoZa1.2way || style="text-align: center;"| -0.5 || style="text-align: center;"| — || Derivational morphology from WordNet
|-
| WordNet || PeMoZa1.2way || style="text-align: center;"| 1.33 || style="text-align: center;"| — || Verb entailment from WordNet
|-
| WordNet || PeMoZa2.2way || style="text-align: center;"| 1 || style="text-align: center;"| — || Derivational morphology from WordNet
|-
| WordNet || PeMoZa2.2way || style="text-align: center;"| -0.33 || style="text-align: center;"| — || Verb entailment from WordNet
|-
| WordNet || QUANTA1.2way || style="text-align: center;"| -0.17 || style="text-align: center;"| — || We use several relations from WordNet, such as synonyms, hyponyms, and hypernyms
|-
| WordNet || Sagan1.3way || style="text-align: center;"| 0 || style="text-align: center;"| -0.83 || The system is based on a machine-learning approach; the ablation test was obtained with two fewer WordNet features in the training and testing steps
|-
| WordNet || Siel_093.3way || style="text-align: center;"| 0.34 || style="text-align: center;"| -0.17 || Similarity between nouns using a WordNet tool
|-
| WordNet || ssl1.3way || style="text-align: center;"| 0 || style="text-align: center;"| 0.67 || WordNet analysis
|-
| WordNet || UB.dmirg3.2way || style="text-align: center;"| 0 || style="text-align: center;"| — ||
|-
| WordNet || UI_ccg1.2way || style="text-align: center;"| 4 || style="text-align: center;"| — || word similarity == identity
|-
| WordNet + FrameNet || UB.dmirg3.2way || style="text-align: center;"| 0 || style="text-align: center;"| — ||
|-
| WordNet + VerbOcean || DFKI1.3way || style="text-align: center;"| 0 || style="text-align: center;"| 0.17 ||
|-
| WordNet + VerbOcean || DFKI2.3way || style="text-align: center;"| 0.5 || style="text-align: center;"| 0.67 ||
|-
| WordNet + VerbOcean || DFKI3.3way || style="text-align: center;"| 0.17 || style="text-align: center;"| 0.17 ||
|-
| WordNet + VerbOcean || UAIC20091.3way || style="text-align: center;"| 2 || style="text-align: center;"| 1.50 || Contradiction identification
|-
| WordNet + VerbOcean + DLSIUAES_negation_list || DLSIUAES1.2way || style="text-align: center;"| 0.66 || style="text-align: center;"| — || Antonym relations between verbs (VO+WN); polarity based on negation terms (short list constructed by ourselves)
|-
| WordNet + VerbOcean + DLSIUAES_negation_list || DLSIUAES1.3way || style="text-align: center;"| -1 || style="text-align: center;"| -0.5 || Antonym relations between verbs (VO+WN); polarity based on negation terms (short list constructed by ourselves)
|-
| WordNet + XWordNet || UAIC20091.3way || style="text-align: center;"| 1 || style="text-align: center;"| 1.33 || Synonymy, hyponymy and hypernymy, and eXtended WordNet relations
|}
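As a minimal sketch (not part of the original page), an ablation-test Δ-accuracy figure like those in the table is typically computed as the accuracy of the complete system minus the accuracy of the run with the resource removed, in percentage points, so a positive Δ suggests the resource helped. The label sets and runs below are hypothetical.

```python
def accuracy(predictions, gold):
    """Fraction of T/H pairs labeled correctly."""
    assert len(predictions) == len(gold)
    return sum(p == g for p, g in zip(predictions, gold)) / len(gold)

def ablation_delta(full_run, ablated_run, gold):
    """Delta accuracy in percentage points: full system minus ablated system.
    Positive means the ablated resource contributed positively (assumed sign
    convention)."""
    return round(100 * (accuracy(full_run, gold) - accuracy(ablated_run, gold)), 2)

# Hypothetical 2-way labels (E = ENTAILMENT, N = NO ENTAILMENT) for 6 pairs:
gold    = ["E", "E", "N", "N", "E", "N"]
full    = ["E", "E", "N", "N", "N", "N"]  # 5/6 correct
ablated = ["E", "N", "N", "N", "N", "N"]  # 4/6 correct

print(ablation_delta(full, ablated, gold))  # 16.67
```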