Temporal Information Extraction (State of the art)
==Data sets==

==Performance measures==

==Results==
The following results refer to TempEval-3, the most recent evaluation exercise.

===Task A: Temporal expression extraction and normalisation===

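Task A asks systems both to identify temporal expression spans and to normalise each expression to a TIMEX3 value anchored on the document creation time (DCT). As a purely illustrative sketch (not any participant's actual method), a toy normaliser for a few relative expressions might look like:

```python
from datetime import date, timedelta

def normalise(expression: str, dct: date) -> str:
    """Toy TIMEX3 value normaliser: maps a handful of relative
    expressions to ISO 8601 values, anchored on the DCT."""
    expression = expression.lower().strip()
    if expression == "today":
        return dct.isoformat()
    if expression == "yesterday":
        return (dct - timedelta(days=1)).isoformat()
    if expression == "tomorrow":
        return (dct + timedelta(days=1)).isoformat()
    if expression == "last year":
        return str(dct.year - 1)  # TIMEX3 allows year-granularity values
    raise ValueError(f"cannot normalise: {expression!r}")

# With a DCT of 2013-06-11:
print(normalise("yesterday", date(2013, 6, 11)))  # -> 2013-06-10
```

Real systems handle far richer phenomena (durations, sets, underspecified values), but the anchoring-on-DCT idea is the same.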
{| border="1" cellpadding="5" cellspacing="1" width="100%"
|-
! rowspan="3" | System name
! rowspan="3" | Short description
! rowspan="3" | Main publication
! colspan="6" | Identification
! colspan="2" | Normalisation
! rowspan="3" | Overall score
! rowspan="3" | Software
! rowspan="3" | License
|-
! colspan="3" | Strict matching
! colspan="3" | Lenient matching
! colspan="2" | Accuracy
|-
! Pre. !! Rec. !! F1 !! Pre. !! Rec. !! F1 !! Type !! Value
|-
| HeidelTime-t || || || || || || || || || || || || ||
|-
| HeidelTime-bf || || || || || || || || || || || || ||
|-
| HeidelTime-1.2 || || || || || || || || || || || || ||
|-
| NavyTime-1,2 || || || || || || || || || || || || ||
|-
| ManTIME-4 || || || || || || || || || || || || ||
|-
| ManTIME-6 || || || || || || || || || || || || ||
|-
| ManTIME-3 || || || || || || || || || || || || ||
|-
| SUTime || || || || || || || || || || || || ||
|-
| ManTIME-1 || || || || || || || || || || || || ||
|-
| ManTIME-5 || || || || || || || || || || || || ||
|-
| ManTIME-2 || || || || || || || || || || || || ||
|-
| ATT-2 || || || || || || || || || || || || ||
|-
| ATT-1 || || || || || || || || || || || || ||
|-
| ClearTK-1,2 || || || || || || || || || || || || ||
|-
| JU-CSE || || || || || || || || || || || || ||
|-
| KUL || || || || || || || || || || || || ||
|-
| KUL-TE3RunABC || || || || || || || || || || || || ||
|-
| ClearTK-3,4 || || || || || || || || || || || || ||
|-
| ATT-3 || || || || || || || || || || || || ||
|-
| FSS-TimEx || || || || || || || || || || || || ||
|}

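In the identification columns above, strict matching credits a system expression only when its span is identical to a gold span, while lenient matching credits any overlap; normalisation accuracy is then measured over the matched expressions. A minimal sketch of the span-level scoring, assuming annotations reduced to (start, end) character offsets:

```python
def f1(p: float, r: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * p * r / (p + r) if p + r else 0.0

def span_scores(gold, system, strict=True):
    """Precision/recall/F1 over spans given as (start, end) offsets.
    Strict: exact span match; lenient: any character overlap."""
    def match(g, s):
        return g == s if strict else (g[0] < s[1] and s[0] < g[1])
    # Precision: fraction of system spans matching some gold span.
    tp_sys = sum(any(match(g, s) for g in gold) for s in system)
    # Recall: fraction of gold spans matched by some system span.
    tp_gold = sum(any(match(g, s) for s in system) for g in gold)
    p = tp_sys / len(system) if system else 0.0
    r = tp_gold / len(gold) if gold else 0.0
    return p, r, f1(p, r)

gold = [(0, 9), (20, 29)]
system = [(0, 9), (21, 29), (40, 45)]
print(span_scores(gold, system, strict=True))   # only (0, 9) matches exactly
print(span_scores(gold, system, strict=False))  # (21, 29) also overlaps (20, 29)
```

The official TempEval-3 scorer additionally handles entity attributes and combines these measures into the overall score; this sketch covers only the span-matching core.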
===Task B: Event extraction and classification===

===Task C: Annotating relations given gold entities===

===Task C relation only: Annotating relations given gold entities and related pairs===

==Challenges==
* '''TempEval''', ''Temporal Relation Identification'', 2007: [http://www.timeml.org/tempeval/ web page]
* '''TempEval-2''', ''Evaluating Events, Time Expressions, and Temporal Relations'', 2010: [http://www.timeml.org/tempeval2/ web page]
* '''TempEval-3''', ''Evaluating Time Expressions, Events, and Temporal Relations'', 2013: [http://www.cs.york.ac.uk/semeval-2013/task1/ web page]

==References==

* UzZaman, N., Llorens, H., Derczynski, L., Allen, J., Verhagen, M., and Pustejovsky, J. [http://www.aclweb.org/anthology/S/S13/S13-2001.pdf SemEval-2013 Task 1: TempEval-3: Evaluating time expressions, events, and temporal relations]. In Second Joint Conference on Lexical and Computational Semantics (*SEM), Volume 2: Proceedings of the Seventh International Workshop on Semantic Evaluation (SemEval 2013) (Atlanta, Georgia, USA, June 2013), Association for Computational Linguistics, pp. 1–9.
* Bethard, S. [http://www.aclweb.org/anthology/S/S13/S13-2002.pdf ClearTK-TimeML: A minimalist approach to TempEval 2013]. In Second Joint Conference on Lexical and Computational Semantics (*SEM), Volume 2: Proceedings of the Seventh International Workshop on Semantic Evaluation (SemEval 2013) (Atlanta, Georgia, USA, June 2013), Association for Computational Linguistics, pp. 10–14.
* Strötgen, J., Zell, J., and Gertz, M. [http://www.aclweb.org/anthology/S/S13/S13-2003.pdf HeidelTime: Tuning English and developing Spanish resources for TempEval-3]. In Second Joint Conference on Lexical and Computational Semantics (*SEM), Volume 2: Proceedings of the Seventh International Workshop on Semantic Evaluation (SemEval 2013) (Atlanta, Georgia, USA, June 2013), Association for Computational Linguistics, pp. 15–19.
* Jung, H., and Stent, A. [http://www.aclweb.org/anthology/S/S13/S13-2004.pdf ATT1: Temporal annotation using big windows and rich syntactic and semantic features]. In Second Joint Conference on Lexical and Computational Semantics (*SEM), Volume 2: Proceedings of the Seventh International Workshop on Semantic Evaluation (SemEval 2013) (Atlanta, Georgia, USA, June 2013), Association for Computational Linguistics, pp. 20–24.
* Filannino, M., Brown, G., and Nenadic, G. [http://www.aclweb.org/anthology/S/S13/S13-2009.pdf ManTIME: Temporal expression identification and normalization in the TempEval-3 challenge]. In Second Joint Conference on Lexical and Computational Semantics (*SEM), Volume 2: Proceedings of the Seventh International Workshop on Semantic Evaluation (SemEval 2013) (Atlanta, Georgia, USA, June 2013), Association for Computational Linguistics, pp. 53–57.
* Zavarella, V., and Tanev, H. [http://www.aclweb.org/anthology/S/S13/S13-2010.pdf FSS-TimEx for TempEval-3: Extracting temporal information from text]. In Second Joint Conference on Lexical and Computational Semantics (*SEM), Volume 2: Proceedings of the Seventh International Workshop on Semantic Evaluation (SemEval 2013) (Atlanta, Georgia, USA, June 2013), Association for Computational Linguistics, pp. 58–63.
* Kolya, A. K., Kundu, A., Gupta, R., Ekbal, A., and Bandyopadhyay, S. JU-CSE: A CRF-based approach to annotation of temporal expression, event and temporal relations. In Second Joint Conference on Lexical and Computational Semantics (*SEM), Volume 2: Proceedings of the Seventh International Workshop on Semantic Evaluation (SemEval 2013) (Atlanta, Georgia, USA, June 2013), Association for Computational Linguistics, pp. 64–72.
* Chambers, N. NavyTime: Event and time ordering from raw text. In Second Joint Conference on Lexical and Computational Semantics (*SEM), Volume 2: Proceedings of the Seventh International Workshop on Semantic Evaluation (SemEval 2013) (Atlanta, Georgia, USA, June 2013), Association for Computational Linguistics, pp. 73–77.
* Chang, A., and Manning, C. D. SUTime: Evaluation in TempEval-3. In Second Joint Conference on Lexical and Computational Semantics (*SEM), Volume 2: Proceedings of the Seventh International Workshop on Semantic Evaluation (SemEval 2013) (Atlanta, Georgia, USA, June 2013), Association for Computational Linguistics, pp. 78–82.
* Kolomiyets, O., and Moens, M.-F. KUL: Data-driven approach to temporal parsing of newswire articles. In Second Joint Conference on Lexical and Computational Semantics (*SEM), Volume 2: Proceedings of the Seventh International Workshop on Semantic Evaluation (SemEval 2013) (Atlanta, Georgia, USA, June 2013), Association for Computational Linguistics, pp. 83–87.
* Laokulrat, N., Miwa, M., Tsuruoka, Y., and Chikayama, T. UTTime: Temporal relation classification using deep syntactic features. In Second Joint Conference on Lexical and Computational Semantics (*SEM), Volume 2: Proceedings of the Seventh International Workshop on Semantic Evaluation (SemEval 2013) (Atlanta, Georgia, USA, June 2013), Association for Computational Linguistics, pp. 88–92.

== See also ==
* [[State of the art]]

== External links ==
* [http://timexportal.info TimexPortal]

[[Category:State of the art]]
Revision as of 01:58, 11 June 2013