Deep Learning for Semantic Composition

Xiaodan Zhu, Edward Grefenstette


Abstract
Learning representations to model the meaning of text has been a core problem in NLP. The past several years have seen extensive interest in distributional approaches, in which text spans of different granularities are encoded as vectors of numerical values. If properly learned, such representations have been shown to achieve state-of-the-art performance on a wide range of NLP problems. In this tutorial, we cover the fundamentals and the state-of-the-art research on neural network-based modeling for semantic composition, which aims to learn distributed representations for different granularities of text, e.g., phrases, sentences, or even documents, from the meaning representations of their sub-components, e.g., word embeddings.
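To make the idea of composition concrete, here is a minimal toy sketch (not from the tutorial itself): the simplest composition function builds a phrase vector from word embeddings by element-wise averaging. The embeddings and the `compose_average` helper below are made up for illustration; the models the tutorial covers (e.g., recurrent or recursive networks) learn the composition function rather than fixing it.

```python
# Toy sketch of additive semantic composition: a phrase vector is the
# element-wise mean of its word vectors. The 3-d embeddings here are
# hypothetical values chosen only for illustration.

def compose_average(vectors):
    """Compose word vectors into one phrase vector by element-wise mean."""
    dim = len(vectors[0])
    return [sum(v[i] for v in vectors) / len(vectors) for i in range(dim)]

# Made-up word embeddings (in practice, learned from data).
embeddings = {
    "deep":     [0.2, 0.8, 0.1],
    "learning": [0.4, 0.6, 0.3],
}

phrase_vec = compose_average([embeddings["deep"], embeddings["learning"]])
print(phrase_vec)  # element-wise mean of the two word vectors
```

Averaging ignores word order; the neural composition models discussed in the tutorial replace this fixed function with learned, order-sensitive ones.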
Anthology ID:
P17-5003
Volume:
Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics: Tutorial Abstracts
Month:
July
Year:
2017
Address:
Vancouver, Canada
Editors:
Maja Popović, Jordan Boyd-Graber
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
6–7
URL:
https://aclanthology.org/P17-5003
Cite (ACL):
Xiaodan Zhu and Edward Grefenstette. 2017. Deep Learning for Semantic Composition. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics: Tutorial Abstracts, pages 6–7, Vancouver, Canada. Association for Computational Linguistics.
Cite (Informal):
Deep Learning for Semantic Composition (Zhu & Grefenstette, ACL 2017)
PDF:
https://aclanthology.org/P17-5003.pdf
Video:
https://vimeo.com/234950059