Deep Exhaustive Model for Nested Named Entity Recognition

Mohammad Golam Sohrab, Makoto Miwa


Abstract
We propose a simple deep neural model for nested named entity recognition (NER). Most NER models focus on flat entities and ignore nested entities, and therefore fail to fully capture the underlying semantic information in texts. The key idea of our model is to enumerate all possible regions, or spans, as potential entity mentions and classify them with deep neural networks. To reduce the computational cost and capture the context around each region, the model represents the regions using the outputs of a shared underlying bidirectional long short-term memory (LSTM) layer. We evaluate our exhaustive model on the GENIA and JNLPBA corpora in the biomedical domain, and the results show that our model outperforms state-of-the-art models on nested and flat NER, achieving F-scores of 77.1% and 78.4%, respectively, without any external knowledge resources.
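To make the exhaustive idea concrete, the sketch below enumerates every contiguous span up to a maximum length and scores it with a feed-forward classifier over features taken from a shared BiLSTM. This is a minimal illustration, not the authors' released code: the class name ExhaustiveSpanNER, the max_span_len parameter, and the choice of span representation (first token, last token, and average of the BiLSTM outputs inside the region) are assumptions made for the example.

import torch
import torch.nn as nn


class ExhaustiveSpanNER(nn.Module):  # hypothetical class name
    def __init__(self, vocab_size, embed_dim, hidden_dim, num_labels, max_span_len=6):
        super().__init__()
        self.max_span_len = max_span_len  # assumed cap on candidate span length
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Shared BiLSTM: its outputs are reused by every candidate span,
        # which keeps the cost of exhaustive enumeration manageable.
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        # Span representation: [first token; last token; average] of BiLSTM outputs.
        self.classifier = nn.Sequential(
            nn.Linear(6 * hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, num_labels),  # labels include a "non-entity" class
        )

    def forward(self, token_ids):
        # token_ids: (1, seq_len) tensor of token indices for one sentence
        outputs, _ = self.bilstm(self.embed(token_ids))  # (1, seq_len, 2*hidden_dim)
        seq_len = outputs.size(1)
        spans, reps = [], []
        # Enumerate every contiguous region up to max_span_len tokens.
        for start in range(seq_len):
            for end in range(start, min(start + self.max_span_len, seq_len)):
                region = outputs[0, start:end + 1]               # (len, 2*hidden_dim)
                rep = torch.cat([region[0], region[-1], region.mean(dim=0)])
                spans.append((start, end))
                reps.append(rep)
        logits = self.classifier(torch.stack(reps))              # (num_spans, num_labels)
        return spans, logits

Because overlapping spans are scored independently, nested mentions fall out naturally: a span and any span contained inside it can both receive entity labels, which a flat sequence-labeling scheme cannot express.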
Anthology ID:
D18-1309
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
2843–2849
URL:
https://aclanthology.org/D18-1309
DOI:
10.18653/v1/D18-1309
Cite (ACL):
Mohammad Golam Sohrab and Makoto Miwa. 2018. Deep Exhaustive Model for Nested Named Entity Recognition. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 2843–2849, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Deep Exhaustive Model for Nested Named Entity Recognition (Sohrab & Miwa, EMNLP 2018)
PDF:
https://aclanthology.org/D18-1309.pdf
Video:
https://aclanthology.org/D18-1309.mp4
Data
GENIA