RTE6 - Ablation Tests

The following table lists the results of the ablation tests (a mandatory track since the RTE5 campaign) submitted by participants to RTE6.


Participants are kindly invited to check that all the inserted information is correct and complete.


Ablated Component        | Ablation Run[1]    | Resource Impact (F1 points) | Resource Usage Description
WordNet                  | BIU1_abl-1         |  0.90 | No WordNet. On the dev set: 39.18% (compared to 40.73% when WordNet is used).
CatVar                   | BIU1_abl-2         |  0.63 | No CatVar. On the dev set: about 40.20% (compared to 40.73% when CatVar is used).
Coreference resolver     | BIU1_abl-3         | -0.88 | No coreference resolver. On the dev set: 41.62% (compared to 40.73% when the coreference resolver is used). This is an unusual ablation test: it shows that the coreference resolution component has a negative impact.
DIRT                     | Boeing1_abl-1      |  3.97 | DIRT removed.
WordNet                  | Boeing1_abl-2      |  4.42 | No WordNet.
Name Normalization       | budapestcad2_abl2  |  0.65 | No name normalization was performed (e.g. George W. Bush -> Bush; illustrated below).
Named Entity Recognition | budapestcad2_abl3  | -1.23 | No NER.
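
The page does not state how the impact column is computed, but the dev-set figures quoted in the BIU rows fit the usual convention impact = F1(with resource) - F1(without resource), so a negative value means the system scored higher once the resource was removed. A minimal sketch of that computation in Python, reusing the dev-set percentages from the table (the impact column itself comes from the test-set runs, so the figures differ slightly):

    # Illustrative sketch, not official RTE6 scoring code. Assumed convention:
    # impact = F1(full system) - F1(ablated system), so a positive value means
    # the removed resource was helping and a negative value means it was hurting.

    F1_FULL_DEV = 40.73  # BIU1 full-system F1 on the dev set (%), from the table

    dev_ablations = {
        "BIU1_abl-1 (no WordNet)":     39.18,
        "BIU1_abl-2 (no CatVar)":      40.20,
        "BIU1_abl-3 (no coreference)": 41.62,
    }

    for run, f1_ablated in dev_ablations.items():
        impact = F1_FULL_DEV - f1_ablated
        verdict = "helps" if impact > 0 else "hurts"
        print(f"{run}: dev-set impact = {impact:+.2f} F1 points ({verdict})")

    # BIU1_abl-3 comes out negative (-0.89 on dev), consistent with the -0.88
    # test-set figure in the table: removing the coreference resolver improved F1.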

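The name normalization ablated in budapestcad2_abl2 is only illustrated by the example George W. Bush -> Bush; how the system actually implemented it is not described here. A hypothetical sketch of that kind of normalization, using a simple last-token (surname) heuristic:

    # Hypothetical sketch only: the budapestcad2 system's actual method is not
    # documented on this page. A common heuristic is to reduce a person name to
    # its final token (the surname) so that different mentions of the same
    # entity match during comparison.

    def normalize_person_name(name: str) -> str:
        """Reduce a multi-token person name to its last token."""
        tokens = name.split()
        return tokens[-1] if tokens else name

    print(normalize_person_name("George W. Bush"))  # -> Bush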

Footnotes

  1. For further information about participants, see RTE Challenges - Data about participants


   Return to RTE Knowledge Resources