Put It Back: Entity Typing with Language Model Enhancement

Ji Xin, Hao Zhu, Xu Han, Zhiyuan Liu, Maosong Sun


Abstract
Entity typing aims to classify the semantic types of an entity mention in a specific context. Most existing models obtain training data via distant supervision and thus inevitably suffer from noisy labels. To address this issue, we propose entity typing with language model enhancement. It uses a language model to measure the compatibility between context sentences and labels, and thereby automatically focuses more on context-dependent labels. Experiments on benchmark datasets demonstrate that our method enhances the entity typing model with information from the language model and significantly outperforms the state-of-the-art baseline. Code and data for this paper can be found at https://github.com/thunlp/LME.
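The sketch below illustrates the "put it back" intuition summarized in the abstract: substitute each candidate type label for the entity mention and let a language model judge how naturally the label fits the context, so that context-compatible labels receive more weight than noisy distant-supervision labels. This is a minimal illustration, not the authors' implementation (that lives in the linked thunlp/LME repository, where the language model is integrated into training): it assumes a pretrained GPT-2 via HuggingFace transformers as a stand-in language model, and the function names sentence_nll and label_compatibility, as well as the softmax-over-likelihood scoring, are illustrative choices.

```python
# Minimal sketch of the "put it back" idea: replace the entity mention with
# each candidate type label and score the resulting sentence with a language
# model. NOT the paper's implementation; GPT-2 is an assumed stand-in LM.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def sentence_nll(sentence):
    """Average per-token negative log-likelihood of a sentence under the LM."""
    ids = tokenizer(sentence, return_tensors="pt").input_ids
    with torch.no_grad():
        out = model(ids, labels=ids)  # loss = mean token-level cross-entropy
    return out.loss.item()

def label_compatibility(context, mention, labels):
    """Softmax over negated NLLs of the context with the mention replaced
    ("put back") by each candidate type label."""
    scores = torch.tensor(
        [-sentence_nll(context.replace(mention, label)) for label in labels]
    )
    weights = torch.softmax(scores, dim=0)
    return dict(zip(labels, weights.tolist()))

# Distant supervision may attach every KB type of "Jordan" to this mention,
# but only the context-compatible type should score high here.
ctx = "Jordan scored 32 points in the final game."
print(label_compatibility(ctx, "Jordan", ["athlete", "country", "politician"]))
```

In this toy example, "athlete" should receive the largest weight because "athlete scored 32 points in the final game" reads far more naturally to the LM than the "country" or "politician" substitutions; such compatibility weights can then be used to down-weight context-incompatible labels in the typing loss, which is the role the language model plays in the paper.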
Anthology ID: D18-1121
Volume: Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month: October-November
Year: 2018
Address: Brussels, Belgium
Editors: Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue: EMNLP
SIG: SIGDAT
Publisher: Association for Computational Linguistics
Pages: 993–998
URL: https://aclanthology.org/D18-1121
DOI: 10.18653/v1/D18-1121
Cite (ACL): Ji Xin, Hao Zhu, Xu Han, Zhiyuan Liu, and Maosong Sun. 2018. Put It Back: Entity Typing with Language Model Enhancement. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 993–998, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal): Put It Back: Entity Typing with Language Model Enhancement (Xin et al., EMNLP 2018)
PDF: https://aclanthology.org/D18-1121.pdf
Code: thunlp/LME (https://github.com/thunlp/LME)