Tensor Product Generation Networks for Deep NLP Modeling

Qiuyuan Huang, Paul Smolensky, Xiaodong He, Li Deng, Dapeng Wu


Abstract
We present a new approach to the design of deep networks for natural language processing (NLP), based on the general technique of Tensor Product Representations (TPRs) for encoding and processing symbol structures in distributed neural networks. A network architecture — the Tensor Product Generation Network (TPGN) — is proposed which is capable in principle of carrying out TPR computation, but which uses unconstrained deep learning to design its internal representations. Instantiated in a model for image-caption generation, TPGN outperforms LSTM baselines when evaluated on the COCO dataset. The TPR-capable structure enables interpretation of internal representations and operations, which prove to contain considerable grammatical content. Our caption-generation model can be interpreted as generating sequences of grammatical categories and retrieving words by their categories from a plan encoded as a distributed representation.
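To make the underlying technique concrete, below is a minimal sketch of Tensor Product Representation (TPR) binding and unbinding, the general mechanism the TPGN is designed to be capable of; it is not the paper's TPGN architecture, and the dimensions, vectors, and symbols are arbitrary illustrative assumptions.

```python
import numpy as np

# Minimal TPR sketch (illustrative assumptions only, not the TPGN model):
# a symbol structure is encoded as a sum of outer products of filler vectors
# (symbols) with role vectors (structural positions).

d_filler, d_role = 4, 3                      # assumed filler and role dimensions
rng = np.random.default_rng(0)

# Fillers: distributed vectors for the symbols "a", "b", "c"
fillers = {s: rng.standard_normal(d_filler) for s in "abc"}

# Roles: orthonormal vectors for positions 0, 1, 2 (orthonormality allows exact unbinding)
Q, _ = np.linalg.qr(rng.standard_normal((d_role, d_role)))
roles = {i: Q[:, i] for i in range(d_role)}

# Bind each filler to its role with an outer product and superpose (sum):
# T = sum_i f_i (outer) r_i encodes the whole string "cab" in a single tensor.
structure = ["c", "a", "b"]
T = sum(np.outer(fillers[s], roles[i]) for i, s in enumerate(structure))

# Unbind: with orthonormal roles, T @ r_i recovers the filler bound to role i.
for i, s in enumerate(structure):
    recovered = T @ roles[i]
    best = min(fillers, key=lambda k: np.linalg.norm(fillers[k] - recovered))
    print(i, s, best)                        # each position recovers its original symbol
```

In the paper's interpretation, the roles correspond roughly to grammatical categories and the fillers to words, with the distributed "plan" playing the part of the superposed tensor from which items are retrieved.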
Anthology ID:
N18-1114
Volume:
Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers)
Month:
June
Year:
2018
Address:
New Orleans, Louisiana
Editors:
Marilyn Walker, Heng Ji, Amanda Stent
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
1263–1273
URL:
https://aclanthology.org/N18-1114
DOI:
10.18653/v1/N18-1114
Cite (ACL):
Qiuyuan Huang, Paul Smolensky, Xiaodong He, Li Deng, and Dapeng Wu. 2018. Tensor Product Generation Networks for Deep NLP Modeling. In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers), pages 1263–1273, New Orleans, Louisiana. Association for Computational Linguistics.
Cite (Informal):
Tensor Product Generation Networks for Deep NLP Modeling (Huang et al., NAACL 2018)
PDF:
https://aclanthology.org/N18-1114.pdf
Code:
additional community code