Neural Dialogue Context Online End-of-Turn Detection

Ryo Masumura, Tomohiro Tanaka, Atsushi Ando, Ryo Ishii, Ryuichiro Higashinaka, Yushi Aono


Abstract
This paper proposes a fully neural-network-based, dialogue-context-aware online end-of-turn detection method that can utilize long-range interactive information extracted from both the speaker's and the collocutor's utterances. The proposed method combines multiple time-asynchronous long short-term memory recurrent neural networks (LSTM-RNNs), which capture the speaker's and the collocutor's sequential features as well as their interactions. Assuming application to spoken dialogue systems, we introduce the speaker's acoustic sequential features and the collocutor's linguistic sequential features, each of which can be extracted in an online manner. Our evaluation confirms the effectiveness of taking into consideration the dialogue context formed by the speaker's and the collocutor's utterances.
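
To make the described architecture concrete, below is a minimal sketch of a model of this kind, assuming PyTorch, late fusion of final hidden states, and illustrative feature dimensions; the class name DialogueContextEoTDetector, the dimensions, and the fusion scheme are assumptions for illustration, not the authors' implementation.

# Minimal PyTorch sketch (assumed, not the authors' code) of combining two
# time-asynchronous LSTMs: one over the speaker's acoustic frames and one over
# the collocutor's word sequence, fused to predict an end-of-turn probability.
import torch
import torch.nn as nn

class DialogueContextEoTDetector(nn.Module):
    def __init__(self, acoustic_dim=40, vocab_size=10000, embed_dim=128, hidden_dim=256):
        super().__init__()
        # LSTM over the current speaker's acoustic frame sequence (e.g., log-mel features).
        self.acoustic_lstm = nn.LSTM(acoustic_dim, hidden_dim, batch_first=True)
        # LSTM over the collocutor's preceding utterance, given as word IDs.
        self.word_embed = nn.Embedding(vocab_size, embed_dim)
        self.linguistic_lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # The two sequences are time-asynchronous (frame-level vs. word-level),
        # so only their final hidden states are concatenated here.
        self.classifier = nn.Sequential(
            nn.Linear(2 * hidden_dim, hidden_dim),
            nn.Tanh(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, acoustic_frames, collocutor_words):
        # acoustic_frames: (batch, n_frames, acoustic_dim)
        # collocutor_words: (batch, n_words) integer word IDs
        _, (h_ac, _) = self.acoustic_lstm(acoustic_frames)
        _, (h_lg, _) = self.linguistic_lstm(self.word_embed(collocutor_words))
        fused = torch.cat([h_ac[-1], h_lg[-1]], dim=-1)
        # Probability that the current speaker's turn ends at this point.
        return torch.sigmoid(self.classifier(fused)).squeeze(-1)

# Example usage with dummy inputs.
model = DialogueContextEoTDetector()
frames = torch.randn(2, 300, 40)           # 2 dialogues, 300 acoustic frames each
words = torch.randint(0, 10000, (2, 15))   # collocutor's previous utterance, 15 words
print(model(frames, words).shape)          # -> torch.Size([2])

In an online setting, such a detector would be invoked repeatedly as new acoustic frames arrive, reusing the collocutor-side encoding computed once per preceding utterance.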
Anthology ID: W18-5024
Volume: Proceedings of the 19th Annual SIGdial Meeting on Discourse and Dialogue
Month: July
Year: 2018
Address: Melbourne, Australia
Editors: Kazunori Komatani, Diane Litman, Kai Yu, Alex Papangelis, Lawrence Cavedon, Mikio Nakano
Venue: SIGDIAL
SIG: SIGDIAL
Publisher: Association for Computational Linguistics
Pages: 224–228
URL: https://aclanthology.org/W18-5024
DOI: 10.18653/v1/W18-5024
Cite (ACL): Ryo Masumura, Tomohiro Tanaka, Atsushi Ando, Ryo Ishii, Ryuichiro Higashinaka, and Yushi Aono. 2018. Neural Dialogue Context Online End-of-Turn Detection. In Proceedings of the 19th Annual SIGdial Meeting on Discourse and Dialogue, pages 224–228, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal): Neural Dialogue Context Online End-of-Turn Detection (Masumura et al., SIGDIAL 2018)
PDF: https://aclanthology.org/W18-5024.pdf