Improving Neural Abstractive Document Summarization with Structural Regularization

Wei Li, Xinyan Xiao, Yajuan Lyu, Yuanzhuo Wang

Abstract
Recent neural sequence-to-sequence models have shown significant progress on short-text summarization. However, for document summarization, they fail to capture the long-term structure of both documents and multi-sentence summaries, resulting in information loss and repetition. In this paper, we propose to leverage the structural information of both documents and multi-sentence summaries to improve document summarization performance. Specifically, we incorporate both structural-compression and structural-coverage regularization into the summarization process to capture information compression and information coverage, the two most important structural properties of document summarization. Experimental results demonstrate that structural regularization significantly improves document summarization, enabling our model to generate more informative and concise summaries and thus to outperform state-of-the-art neural abstractive methods.
Anthology ID:
D18-1441
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
4078–4087
URL:
https://aclanthology.org/D18-1441
DOI:
10.18653/v1/D18-1441
Cite (ACL):
Wei Li, Xinyan Xiao, Yajuan Lyu, and Yuanzhuo Wang. 2018. Improving Neural Abstractive Document Summarization with Structural Regularization. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 4078–4087, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Improving Neural Abstractive Document Summarization with Structural Regularization (Li et al., EMNLP 2018)
PDF:
https://aclanthology.org/D18-1441.pdf
Data
CNN/Daily Mail