Leveraging Context Information for Natural Question Generation

Linfeng Song, Zhiguo Wang, Wael Hamza, Yue Zhang, Daniel Gildea


Abstract
The task of natural question generation is to generate a corresponding question given an input passage (fact) and answer. It is useful for enlarging the training sets of QA systems. Previous work has adopted sequence-to-sequence models that take a passage, with an additional bit indicating the answer position, as input. However, these models do not explicitly capture the interaction between the answer and other context within the passage. We propose a model that matches the answer with the passage before generating the question. Experiments show that our model outperforms the existing state of the art using rich features.
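The "additional bit to indicate answer position" used by the prior sequence-to-sequence approaches can be illustrated with a minimal sketch. The function name and interface here are hypothetical, for illustration only, and are not taken from the paper:

```python
def answer_position_features(passage_tokens, answer_tokens):
    """Return a per-token binary feature: 1 if the token lies inside the
    (first) occurrence of the answer span in the passage, else 0.
    Hypothetical helper illustrating the answer-position input bit."""
    n, m = len(passage_tokens), len(answer_tokens)
    tags = [0] * n
    for i in range(n - m + 1):
        if passage_tokens[i:i + m] == answer_tokens:
            tags[i:i + m] = [1] * m
            break
    return tags

# Example: passage "the cat sat", answer "cat" -> [0, 1, 0]
```

Each token embedding would then be concatenated with this bit before being fed to the encoder; the proposed model instead matches the answer against the full passage explicitly.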
Anthology ID:
N18-2090
Volume:
Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers)
Month:
June
Year:
2018
Address:
New Orleans, Louisiana
Editors:
Marilyn Walker, Heng Ji, Amanda Stent
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
569–574
URL:
https://aclanthology.org/N18-2090
DOI:
10.18653/v1/N18-2090
Cite (ACL):
Linfeng Song, Zhiguo Wang, Wael Hamza, Yue Zhang, and Daniel Gildea. 2018. Leveraging Context Information for Natural Question Generation. In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers), pages 569–574, New Orleans, Louisiana. Association for Computational Linguistics.
Cite (Informal):
Leveraging Context Information for Natural Question Generation (Song et al., NAACL 2018)
PDF:
https://aclanthology.org/N18-2090.pdf
Code:
freesunshine0316/MPQG
Data:
SQuAD