A Multimodal Dialogue System for Medical Decision Support inside Virtual Reality

Alexander Prange, Margarita Chikobava, Peter Poller, Michael Barz, Daniel Sonntag


Abstract
We present a multimodal dialogue system that allows doctors to interact with a medical decision support system in virtual reality (VR). We integrate an interactive visualization of patient records and radiology image data, as well as therapy predictions, which are computed in real time using a deep learning model.
Anthology ID: W17-5504
Volume: Proceedings of the 18th Annual SIGdial Meeting on Discourse and Dialogue
Month: August
Year: 2017
Address: Saarbrücken, Germany
Editors: Kristiina Jokinen, Manfred Stede, David DeVault, Annie Louis
Venue: SIGDIAL
SIG: SIGDIAL
Publisher: Association for Computational Linguistics
Pages: 23–26
URL: https://aclanthology.org/W17-5504
DOI: 10.18653/v1/W17-5504
Cite (ACL): Alexander Prange, Margarita Chikobava, Peter Poller, Michael Barz, and Daniel Sonntag. 2017. A Multimodal Dialogue System for Medical Decision Support inside Virtual Reality. In Proceedings of the 18th Annual SIGdial Meeting on Discourse and Dialogue, pages 23–26, Saarbrücken, Germany. Association for Computational Linguistics.
Cite (Informal): A Multimodal Dialogue System for Medical Decision Support inside Virtual Reality (Prange et al., SIGDIAL 2017)
PDF: https://aclanthology.org/W17-5504.pdf