Learning Word Representations with Cross-Sentence Dependency for End-to-End Co-reference Resolution

Hongyin Luo, Jim Glass


Abstract
In this work, we present a word embedding model that learns cross-sentence dependency for improving end-to-end co-reference resolution (E2E-CR). While the traditional E2E-CR model generates word representations by running long short-term memory (LSTM) recurrent neural networks on each sentence of an input article or conversation separately, we propose linear sentence linking and attentional sentence linking models to learn cross-sentence dependency. Both sentence linking strategies enable the LSTMs to make use of valuable information from context sentences while calculating the representation of the current input word. With this approach, the LSTMs learn word embeddings considering knowledge not only from the current sentence but also from the entire input document. Experiments show that learning cross-sentence dependency enriches the information contained in the word representations and improves the performance of the co-reference resolution model compared with our baseline.
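The linear sentence linking strategy described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it uses a plain tanh recurrent cell as a stand-in for the LSTM, and all names (`rnn_step`, `encode_document`, the `link` flag) are hypothetical. The key idea it demonstrates is that the final hidden state of sentence i initializes the encoder for sentence i+1, rather than resetting the state at each sentence boundary as in the baseline.

```python
import math
import random

random.seed(0)
HID, EMB = 4, 3  # hidden and embedding sizes (toy values)

# Randomly initialized recurrent weights (shared across sentences).
W_x = [[random.uniform(-0.1, 0.1) for _ in range(EMB)] for _ in range(HID)]
W_h = [[random.uniform(-0.1, 0.1) for _ in range(HID)] for _ in range(HID)]

def rnn_step(h, x):
    # One recurrent step (a tanh cell standing in for an LSTM cell).
    return [math.tanh(sum(W_x[i][j] * x[j] for j in range(EMB)) +
                      sum(W_h[i][j] * h[j] for j in range(HID)))
            for i in range(HID)]

def encode_document(sentences, link=True):
    """Return per-word hidden states for each sentence of a document.

    link=False: reset the state at every sentence boundary, so each
                sentence is encoded independently (the baseline setup).
    link=True:  linear sentence linking -- the final state of sentence
                i initializes sentence i+1, so word representations can
                draw on context sentences earlier in the document.
    """
    h = [0.0] * HID
    doc_states = []
    for sent in sentences:
        if not link:
            h = [0.0] * HID  # independent per-sentence encoding
        sent_states = []
        for x in sent:
            h = rnn_step(h, x)
            sent_states.append(h)
        doc_states.append(sent_states)
    return doc_states
```

With linking enabled, the representations of words in the first sentence are unchanged, but words in later sentences receive different (context-aware) states than under independent per-sentence encoding. Attentional sentence linking would replace the single carried-over state with an attention-weighted summary of previous sentence states.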
Anthology ID:
D18-1518
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
4829–4833
URL:
https://aclanthology.org/D18-1518
DOI:
10.18653/v1/D18-1518
Cite (ACL):
Hongyin Luo and Jim Glass. 2018. Learning Word Representations with Cross-Sentence Dependency for End-to-End Co-reference Resolution. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 4829–4833, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Learning Word Representations with Cross-Sentence Dependency for End-to-End Co-reference Resolution (Luo & Glass, EMNLP 2018)
PDF:
https://aclanthology.org/D18-1518.pdf
Video:
https://aclanthology.org/D18-1518.mp4
Data
OntoNotes 5.0