Improving Slot Filling Performance with Attentive Neural Networks on Dependency Structures

Lifu Huang, Avirup Sil, Heng Ji, Radu Florian


Abstract
Slot Filling (SF) aims to extract the values of certain types of attributes (or slots, such as person:cities_of_residence) for a given entity from a large collection of source documents. In this paper we propose an effective DNN architecture for SF with the following new strategies: (1) taking a regularized dependency graph instead of a raw sentence as input to the DNN, to compress the wide context between the query and the candidate filler; (2) incorporating two attention mechanisms, a local attention learned from the query and candidate filler, and a global attention learned from external knowledge bases, to guide the model to better select indicative contexts for determining the slot type. Experiments show that this framework outperforms the state of the art on both relation extraction (16% absolute F-score gain) and slot filling validation for each individual system (up to 8.5% absolute F-score gain).
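
The local attention idea from the abstract can be illustrated with a minimal sketch: hidden states along the dependency path between query and candidate filler are weighted by their relevance to the query/filler pair, then pooled. This is a hedged NumPy illustration, not the authors' exact formulation; the function name, the bilinear parameter W, and all dimensions are assumptions for demonstration.

import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def local_attention(path_states, query_vec, filler_vec, W):
    """Attend over dependency-path hidden states, conditioned on
    the query entity and candidate filler (illustrative only).

    path_states : (T, d) hidden states along the dependency path
    query_vec   : (d,)   embedding of the query entity
    filler_vec  : (d,)   embedding of the candidate filler
    W           : (2d, d) projection matrix (hypothetical parameter)
    """
    cue = np.concatenate([query_vec, filler_vec])   # (2d,) attention cue
    scores = path_states @ (W.T @ cue)              # (T,) relevance scores
    alpha = softmax(scores)                         # attention weights over path tokens
    return alpha @ path_states                      # (d,) attended path summary

# Toy usage: a 5-token dependency path with hidden size 8.
rng = np.random.default_rng(0)
T, d = 5, 8
ctx = local_attention(rng.normal(size=(T, d)),
                      rng.normal(size=d),
                      rng.normal(size=d),
                      rng.normal(size=(2 * d, d)))
print(ctx.shape)  # (8,)

The global attention described in the paper would follow the same pattern, but with the cue vector derived from external knowledge-base representations of the slot type rather than from the query/filler pair.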
Anthology ID:
D17-1274
Volume:
Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
Month:
September
Year:
2017
Address:
Copenhagen, Denmark
Editors:
Martha Palmer, Rebecca Hwa, Sebastian Riedel
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
2588–2597
URL:
https://aclanthology.org/D17-1274
DOI:
10.18653/v1/D17-1274
Cite (ACL):
Lifu Huang, Avirup Sil, Heng Ji, and Radu Florian. 2017. Improving Slot Filling Performance with Attentive Neural Networks on Dependency Structures. In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, pages 2588–2597, Copenhagen, Denmark. Association for Computational Linguistics.
Cite (Informal):
Improving Slot Filling Performance with Attentive Neural Networks on Dependency Structures (Huang et al., EMNLP 2017)
PDF:
https://aclanthology.org/D17-1274.pdf