Neural Relation Extraction with Multi-lingual Attention

Yankai Lin, Zhiyuan Liu, Maosong Sun


Abstract
Relation extraction has been widely used for finding unknown relational facts from plain text. Most existing methods focus on exploiting mono-lingual data for relation extraction, ignoring the massive information available in texts in other languages. To address this issue, we introduce a multi-lingual neural relation extraction framework, which employs mono-lingual attention to utilize the information within mono-lingual texts and further proposes cross-lingual attention to consider the information consistency and complementarity among cross-lingual texts. Experimental results on real-world datasets show that our model can take advantage of multi-lingual texts and consistently achieves significant improvements on relation extraction compared with baselines.
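To make the abstract's idea concrete, the following is a minimal, illustrative sketch of attention over bags of sentences in several languages, not the authors' exact MNRE implementation: it assumes each sentence mentioning an entity pair has already been encoded into a fixed-size vector (e.g., by a CNN encoder) and that attention scores are simple dot products against a per-language query vector; all names and dimensions are hypothetical. Using the same language's query over its own sentences corresponds to mono-lingual attention, while querying one language's sentences with another language's query corresponds to cross-lingual attention.

```python
# Illustrative sketch only; assumes pre-encoded sentence vectors and
# dot-product attention. Names and shapes are hypothetical.
import numpy as np

def attention_pool(sent_vecs, query):
    """Softmax-weighted pooling of sentence vectors against a query vector."""
    scores = sent_vecs @ query                 # (n_sentences,)
    weights = np.exp(scores - scores.max())    # numerically stable softmax
    weights /= weights.sum()
    return weights @ sent_vecs                 # (hidden_dim,)

def multilingual_bag_representations(bags, queries):
    """
    bags:    dict lang -> array (n_sentences, hidden_dim) of encoded sentences
             that mention the same entity pair in that language.
    queries: dict lang -> query vector (hidden_dim,), e.g., a learned
             relation query vector per language.
    Returns one pooled vector per (query language, sentence language) pair:
    equal languages give mono-lingual attention, different languages give
    cross-lingual attention.
    """
    reps = {}
    for k, query in queries.items():           # language providing the query
        for j, sents in bags.items():          # language providing the sentences
            reps[(k, j)] = attention_pool(sents, query)
    return reps

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    hidden = 8
    bags = {"en": rng.normal(size=(3, hidden)),   # 3 English sentences
            "zh": rng.normal(size=(2, hidden))}   # 2 Chinese sentences
    queries = {"en": rng.normal(size=hidden),
               "zh": rng.normal(size=hidden)}
    reps = multilingual_bag_representations(bags, queries)
    # In a full system, the pooled vectors would feed a relation classifier.
    for key, vec in reps.items():
        print(key, vec.shape)
```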
Anthology ID:
P17-1004
Volume:
Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2017
Address:
Vancouver, Canada
Editors:
Regina Barzilay, Min-Yen Kan
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
34–43
URL:
https://aclanthology.org/P17-1004
DOI:
10.18653/v1/P17-1004
Cite (ACL):
Yankai Lin, Zhiyuan Liu, and Maosong Sun. 2017. Neural Relation Extraction with Multi-lingual Attention. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 34–43, Vancouver, Canada. Association for Computational Linguistics.
Cite (Informal):
Neural Relation Extraction with Multi-lingual Attention (Lin et al., ACL 2017)
PDF:
https://aclanthology.org/P17-1004.pdf
Video:
https://aclanthology.org/P17-1004.mp4