Closed-Book Training to Improve Summarization Encoder Memory

Yichen Jiang, Mohit Bansal


Abstract
A good neural sequence-to-sequence summarization model should have a strong encoder that can distill and memorize the important information from long input texts so that the decoder can generate salient summaries based on the encoder’s memory. In this paper, we aim to improve the memorization capabilities of the encoder of a pointer-generator model by adding an additional ‘closed-book’ decoder that has no attention or pointer mechanisms. Such a decoder forces the encoder to be more selective about the information encoded in its memory state, because the decoder cannot rely on the extra information provided by the attention (and possibly copy) mechanisms, and hence the entire model improves. On the CNN/Daily Mail dataset, our 2-decoder model significantly outperforms the baseline in terms of ROUGE and METEOR metrics, for both cross-entropy and reinforced setups (and on human evaluation). Moreover, our model also achieves higher scores in a test-only DUC-2002 generalizability setup. We further present a memory ability test, two saliency metrics, as well as several sanity-check ablations (based on fixed-encoder, gradient-flow cut, and model capacity) to prove that the encoder of our 2-decoder model does in fact learn stronger memory representations than the baseline encoder.
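
The two-decoder idea can be illustrated with a minimal sketch (assuming a PyTorch-style implementation; the module names, the simplified single-query attention, and the loss-mixing weight are illustrative assumptions, not the authors’ code): a shared encoder feeds both an attentional decoder and a plain closed-book decoder that receives only the encoder’s final state, and the two cross-entropy losses are mixed so that the closed-book branch pressures the encoder to pack salient content into its memory state.

import torch
import torch.nn as nn

class TwoDecoderSummarizer(nn.Module):
    """Hypothetical sketch of a shared encoder with (1) an attentional
    decoder and (2) a 'closed-book' decoder that sees only the encoder's
    final memory state. Hyperparameters and layer choices are illustrative."""

    def __init__(self, vocab_size, emb_dim=128, hid_dim=256, mix=0.5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        # Open-book decoder: conditioned on an attention context over the
        # encoder outputs (the pointer/copy distribution is omitted here).
        self.attn_decoder = nn.LSTM(emb_dim + hid_dim, hid_dim, batch_first=True)
        self.attn_proj = nn.Linear(hid_dim, hid_dim)
        # Closed-book decoder: no attention, no pointer; it must decode
        # from the encoder's final (h, c) state alone.
        self.cb_decoder = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)
        self.mix = mix  # weight between the two decoder losses
        self.loss_fn = nn.CrossEntropyLoss()

    def forward(self, src, tgt_in, tgt_out):
        enc_out, (h, c) = self.encoder(self.embed(src))   # enc_out: (B, S, H)
        tgt_emb = self.embed(tgt_in)                       # (B, T, E)

        # Attentional branch: dot-product attention with the encoder's final
        # state as a fixed query (a real pointer-generator recomputes
        # attention at every decoding step and adds a copy distribution).
        scores = torch.bmm(self.attn_proj(enc_out), h[-1].unsqueeze(2))  # (B, S, 1)
        ctx = (enc_out * torch.softmax(scores, dim=1)).sum(1, keepdim=True)
        ctx = ctx.expand(-1, tgt_emb.size(1), -1)          # (B, T, H)
        attn_dec_out, _ = self.attn_decoder(torch.cat([tgt_emb, ctx], -1), (h, c))
        attn_logits = self.out(attn_dec_out)

        # Closed-book branch: initialized from (h, c) only, no context vector.
        cb_dec_out, _ = self.cb_decoder(tgt_emb, (h, c))
        cb_logits = self.out(cb_dec_out)

        loss_attn = self.loss_fn(attn_logits.reshape(-1, attn_logits.size(-1)),
                                 tgt_out.reshape(-1))
        loss_cb = self.loss_fn(cb_logits.reshape(-1, cb_logits.size(-1)),
                               tgt_out.reshape(-1))
        # Joint training signal: the closed-book loss can only go down if the
        # encoder's final state already carries the salient information.
        return self.mix * loss_attn + (1 - self.mix) * loss_cb

In this sketch the closed-book branch contributes only to the training loss; summary generation would still go through the attentional decoder.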
Anthology ID: D18-1440
Volume: Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month: October-November
Year: 2018
Address: Brussels, Belgium
Editors: Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue: EMNLP
SIG: SIGDAT
Publisher: Association for Computational Linguistics
Pages: 4067–4077
URL: https://aclanthology.org/D18-1440
DOI: 10.18653/v1/D18-1440
Cite (ACL): Yichen Jiang and Mohit Bansal. 2018. Closed-Book Training to Improve Summarization Encoder Memory. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 4067–4077, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal): Closed-Book Training to Improve Summarization Encoder Memory (Jiang & Bansal, EMNLP 2018)
PDF: https://aclanthology.org/D18-1440.pdf
Attachment: D18-1440.Attachment.pdf
Data: CNN/Daily Mail