Character-based Bidirectional LSTM-CRF with words and characters for Japanese Named Entity Recognition

Shotaro Misawa, Motoki Taniguchi, Yasuhide Miura, Tomoko Ohkuma

Abstract
Recently, neural models have shown superior performance over conventional models on NER tasks. These models use a CNN to extract sub-word information and an RNN to predict a tag for each word. However, they have been tested almost entirely on English text, and it remains unclear whether they perform similarly in other languages. We applied neural models to Japanese NER and identified two obstacles to the state-of-the-art model. First, a CNN is unsuitable for extracting Japanese sub-word information. Second, a model that predicts one tag per word cannot extract an entity when only part of a word composes the entity. The contributions of this work are (1) verifying the effectiveness of the state-of-the-art NER model on Japanese and (2) proposing a neural model that predicts a tag for each character using both word and character information. Experimental results demonstrate that our model outperforms the state-of-the-art English NER neural model on Japanese.
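To make the architecture described in the abstract concrete, the following is a minimal sketch in PyTorch of a character-based BiLSTM-CRF: each character is represented by its character embedding concatenated with the embedding of the word that contains it, a bidirectional LSTM produces per-character features, and a small CRF scores tag sequences over characters. The class name, layer sizes, and the single-sequence CRF implementation are illustrative assumptions, not the authors' released code.

    import torch
    import torch.nn as nn

    class CharWordBiLSTMCRF(nn.Module):
        def __init__(self, n_chars, n_words, n_tags,
                     char_dim=25, word_dim=50, hidden=100):
            super().__init__()
            self.char_emb = nn.Embedding(n_chars, char_dim)
            self.word_emb = nn.Embedding(n_words, word_dim)
            self.lstm = nn.LSTM(char_dim + word_dim, hidden, bidirectional=True)
            self.emit = nn.Linear(2 * hidden, n_tags)
            # trans[i, j]: score of transitioning from tag i to tag j.
            self.trans = nn.Parameter(torch.randn(n_tags, n_tags))

        def emissions(self, char_ids, word_ids):
            # char_ids: (T,) one id per character; word_ids: (T,) id of the
            # word containing each character, so word information is
            # repeated for every character inside that word.
            x = torch.cat([self.char_emb(char_ids),
                           self.word_emb(word_ids)], dim=-1)
            h, _ = self.lstm(x.unsqueeze(1))   # (T, 1, 2*hidden)
            return self.emit(h.squeeze(1))     # (T, n_tags)

        def neg_log_likelihood(self, char_ids, word_ids, tags):
            e = self.emissions(char_ids, word_ids)
            # Score of the gold character-level tag path.
            gold = e[0, tags[0]]
            for t in range(1, len(tags)):
                gold = gold + self.trans[tags[t - 1], tags[t]] + e[t, tags[t]]
            # Log partition function via the CRF forward algorithm.
            alpha = e[0]
            for t in range(1, e.size(0)):
                alpha = torch.logsumexp(alpha.unsqueeze(1) + self.trans,
                                        dim=0) + e[t]
            return torch.logsumexp(alpha, dim=0) - gold

A hypothetical training step would look like the following; Viterbi decoding for prediction is omitted for brevity:

    model = CharWordBiLSTMCRF(n_chars=3000, n_words=10000, n_tags=9)
    loss = model.neg_log_likelihood(torch.tensor([5, 6, 7]),
                                    torch.tensor([2, 2, 3]),
                                    torch.tensor([0, 1, 0]))
    loss.backward()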
Anthology ID:
W17-4114
Volume:
Proceedings of the First Workshop on Subword and Character Level Models in NLP
Month:
September
Year:
2017
Address:
Copenhagen, Denmark
Editors:
Manaal Faruqui, Hinrich Schütze, Isabel Trancoso, Yadollah Yaghoobzadeh
Venue:
SCLeM
Publisher:
Association for Computational Linguistics
Pages:
97–102
URL:
https://aclanthology.org/W17-4114
DOI:
10.18653/v1/W17-4114
Bibkey:
Cite (ACL):
Shotaro Misawa, Motoki Taniguchi, Yasuhide Miura, and Tomoko Ohkuma. 2017. Character-based Bidirectional LSTM-CRF with words and characters for Japanese Named Entity Recognition. In Proceedings of the First Workshop on Subword and Character Level Models in NLP, pages 97–102, Copenhagen, Denmark. Association for Computational Linguistics.
Cite (Informal):
Character-based Bidirectional LSTM-CRF with words and characters for Japanese Named Entity Recognition (Misawa et al., SCLeM 2017)
PDF:
https://aclanthology.org/W17-4114.pdf
Data:
CoNLL 2003