Neural Text Generation in Stories Using Entity Representations as Context

Elizabeth Clark, Yangfeng Ji, Noah A. Smith


Abstract
We introduce an approach to neural text generation that explicitly represents entities mentioned in the text. Entity representations are vectors that are updated as the text proceeds; they are designed specifically for narrative text like fiction or news stories. Our experiments demonstrate that modeling entities offers a benefit in two automatic evaluations: mention generation (in which a model chooses which entity to mention next and which words to use in the mention) and selection between a correct next sentence and a distractor from later in the same story. We also conduct a human evaluation on automatically generated text in story contexts; this study supports our emphasis on entities and suggests directions for further research.
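A minimal sketch of the core mechanism the abstract describes: one vector per entity, updated whenever that entity is mentioned, then used as context for generation. The gating scheme, the unit-norm initialization, and all names below are illustrative assumptions, not the paper's exact architecture.

```python
# Hypothetical sketch: dynamically updated entity representations.
# Class and method names are illustrative; the paper's model may differ.
import torch
import torch.nn as nn


class EntityContext(nn.Module):
    """Keeps one vector per entity and refreshes it at each mention."""

    def __init__(self, hidden_size: int):
        super().__init__()
        self.hidden_size = hidden_size
        # Gated (GRU-like) interpolation between the old entity vector
        # and a candidate computed from the current hidden state.
        self.gate = nn.Linear(2 * hidden_size, hidden_size)
        self.proj = nn.Linear(2 * hidden_size, hidden_size)

    def new_entity(self) -> torch.Tensor:
        # A newly introduced entity starts from a random unit-norm vector.
        e = torch.randn(self.hidden_size)
        return e / e.norm()

    def update(self, entity_vec: torch.Tensor, hidden: torch.Tensor) -> torch.Tensor:
        # Called when the entity is mentioned: blend its vector with the
        # language model's hidden state at the mention.
        both = torch.cat([entity_vec, hidden], dim=-1)
        g = torch.sigmoid(self.gate(both))
        candidate = torch.tanh(self.proj(both))
        new_vec = (1 - g) * entity_vec + g * candidate
        return new_vec / new_vec.norm()


ctx = EntityContext(hidden_size=64)
alice = ctx.new_entity()        # entity introduced in the story
h_t = torch.randn(64)           # hidden state at a mention of the entity
alice = ctx.update(alice, h_t)  # entity vector absorbs the new mention
```

In a full model, the updated entity vector would condition both decisions the abstract evaluates: which entity to mention next and which words to use in the mention, for example by feeding it into the output layer alongside the hidden state.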
Anthology ID: N18-1204
Volume: Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers)
Month: June
Year: 2018
Address: New Orleans, Louisiana
Editors: Marilyn Walker, Heng Ji, Amanda Stent
Venue: NAACL
Publisher: Association for Computational Linguistics
Pages: 2250–2260
URL: https://aclanthology.org/N18-1204
DOI: 10.18653/v1/N18-1204
Cite (ACL): Elizabeth Clark, Yangfeng Ji, and Noah A. Smith. 2018. Neural Text Generation in Stories Using Entity Representations as Context. In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers), pages 2250–2260, New Orleans, Louisiana. Association for Computational Linguistics.
Cite (Informal): Neural Text Generation in Stories Using Entity Representations as Context (Clark et al., NAACL 2018)
PDF: https://aclanthology.org/N18-1204.pdf
Video: https://aclanthology.org/N18-1204.mp4