Improving Semantic Relevance for Sequence-to-Sequence Learning of Chinese Social Media Text Summarization

Shuming Ma, Xu Sun, Jingjing Xu, Houfeng Wang, Wenjie Li, Qi Su


Abstract
Current Chinese social media text summarization models are based on an encoder-decoder framework. Although the generated summaries are literally similar to the source texts, they often have low semantic relevance. In this work, our goal is to improve the semantic relevance between source texts and summaries for Chinese social media summarization. We introduce a Semantic Relevance Based neural model that encourages high semantic similarity between texts and summaries. In our model, the source text is represented by a gated attention encoder, while the summary representation is produced by a decoder. In addition, the similarity score between the two representations is maximized during training. Our experiments show that the proposed model outperforms baseline systems on a social media corpus.
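The core idea in the abstract can be sketched as a training objective that combines the usual generation loss with a similarity term between the encoder's text representation and the decoder's summary representation. The sketch below is a minimal illustration, not the paper's implementation: the function names, the use of cosine similarity, and the weight `lam` are assumptions for exposition.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two representation vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def semantic_relevance_loss(nll, src_repr, sum_repr, lam=0.5):
    """Sketch of a semantic-relevance-based objective (hypothetical form):
    the standard negative log-likelihood minus a weighted similarity term,
    so minimizing the loss pushes the summary representation toward
    the source-text representation."""
    return nll - lam * cosine_similarity(src_repr, sum_repr)

# Toy usage: a summary representation closer to the source yields a lower loss.
src = np.array([1.0, 0.0, 1.0])
close_summary = np.array([0.9, 0.1, 1.1])
far_summary = np.array([0.0, 1.0, 0.0])
loss_close = semantic_relevance_loss(2.0, src, close_summary)
loss_far = semantic_relevance_loss(2.0, src, far_summary)
```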
Anthology ID:
P17-2100
Volume:
Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
July
Year:
2017
Address:
Vancouver, Canada
Editors:
Regina Barzilay, Min-Yen Kan
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
635–640
URL:
https://aclanthology.org/P17-2100
DOI:
10.18653/v1/P17-2100
Cite (ACL):
Shuming Ma, Xu Sun, Jingjing Xu, Houfeng Wang, Wenjie Li, and Qi Su. 2017. Improving Semantic Relevance for Sequence-to-Sequence Learning of Chinese Social Media Text Summarization. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 635–640, Vancouver, Canada. Association for Computational Linguistics.
Cite (Informal):
Improving Semantic Relevance for Sequence-to-Sequence Learning of Chinese Social Media Text Summarization (Ma et al., ACL 2017)
PDF:
https://aclanthology.org/P17-2100.pdf