S2SPMN: A Simple and Effective Framework for Response Generation with Relevant Information

Jiaxin Pei, Chenliang Li


Abstract
How to generate relevant and informative responses is one of the core topics in the response generation area. Following the task formulation of machine translation, previous work mainly treats response generation as a mapping from a source sentence to a target sentence, and existing models tend to realize this mapping with intuitive but complex architectures. However, the relevant information that exists in a large dialogue corpus is mostly overlooked. In this paper, we propose Sequence to Sequence with Prototype Memory Network (S2SPMN) to exploit the relevant information provided by a large dialogue corpus to enhance response generation. Specifically, we devise two simple approaches in S2SPMN to select the relevant information (named prototypes) from the dialogue corpus. These prototypes are then saved into a prototype memory network (PMN). Furthermore, a hierarchical attention mechanism is devised to extract semantic information from the PMN to assist the response generation process. Empirical studies reveal the advantage of our model over several classical and strong baselines.
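The abstract describes a two-level (hierarchical) attention over a memory of retrieved prototypes. The paper's own implementation is not shown here; the following is a minimal sketch of what such a mechanism could look like, assuming PyTorch. The class name PrototypeMemoryNetwork, the bilinear attention scorers, and all tensor shapes are illustrative assumptions, not details taken from the paper.

import torch
import torch.nn as nn
import torch.nn.functional as F

class PrototypeMemoryNetwork(nn.Module):
    """Illustrative prototype memory with hierarchical attention.

    Holds K retrieved prototype responses, each encoded as a sequence of
    T hidden states. Given the current decoder state, it attends first
    over the tokens inside each prototype (word level) and then over the
    prototypes themselves (prototype level), returning a single context
    vector for the decoder.
    """

    def __init__(self, hidden_size):
        super().__init__()
        # Bilinear scorers for the two attention levels (an assumption;
        # the paper may use a different scoring function).
        self.word_attn = nn.Linear(hidden_size, hidden_size, bias=False)
        self.proto_attn = nn.Linear(hidden_size, hidden_size, bias=False)

    def forward(self, memory, query):
        # memory: (K, T, H) encoded prototypes; query: (H,) decoder state.
        # Word level: weight every token in every prototype by its
        # relevance to the query, then summarize each prototype.
        word_scores = memory @ self.word_attn(query)                   # (K, T)
        word_weights = F.softmax(word_scores, dim=1)                   # (K, T)
        proto_vecs = (word_weights.unsqueeze(-1) * memory).sum(dim=1)  # (K, H)

        # Prototype level: weight the summarized prototypes and pool
        # them into one context vector.
        proto_scores = proto_vecs @ self.proto_attn(query)             # (K,)
        proto_weights = F.softmax(proto_scores, dim=0)                 # (K,)
        return (proto_weights.unsqueeze(-1) * proto_vecs).sum(dim=0)   # (H,)


# Toy usage: 4 prototypes of 10 tokens each, hidden size 32.
pmn = PrototypeMemoryNetwork(hidden_size=32)
context = pmn(torch.randn(4, 10, 32), torch.randn(32))
print(context.shape)  # torch.Size([32])

The two-level structure mirrors the abstract's description: word-level attention summarizes each prototype, and prototype-level attention decides which prototypes matter at the current decoding step, so the resulting context vector can be fed to the decoder alongside its own hidden state.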
Anthology ID: D18-1082
Volume: Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month: October-November
Year: 2018
Address: Brussels, Belgium
Editors: Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue: EMNLP
SIG: SIGDAT
Publisher: Association for Computational Linguistics
Pages: 745–750
URL: https://aclanthology.org/D18-1082
DOI: 10.18653/v1/D18-1082
Cite (ACL):
Jiaxin Pei and Chenliang Li. 2018. S2SPMN: A Simple and Effective Framework for Response Generation with Relevant Information. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 745–750, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
S2SPMN: A Simple and Effective Framework for Response Generation with Relevant Information (Pei & Li, EMNLP 2018)
PDF: https://aclanthology.org/D18-1082.pdf