Directional Skip-Gram: Explicitly Distinguishing Left and Right Context for Word Embeddings

Yan Song, Shuming Shi, Jing Li, Haisong Zhang


Abstract
In this paper, we present directional skip-gram (DSG), a simple but effective enhancement of the skip-gram model that explicitly distinguishes left and right context in word prediction. To do so, a direction vector is introduced for each word, so that its embedding is learned not only from word co-occurrence patterns in its context but also from the directions of its contextual words. Theoretical and empirical studies on complexity illustrate that our model can be trained as efficiently as the original skip-gram model, when compared to other extensions of the skip-gram model. Experimental results show that our model outperforms others on different datasets in semantic (word similarity measurement) and syntactic (part-of-speech tagging) evaluations, respectively.
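The idea in the abstract can be sketched in a few lines: alongside the usual target/context embeddings, each word gets a direction vector, and the score of a (target, context) pair shifts depending on whether the context word sits to the left or the right of the target. This is a minimal illustrative sketch based only on the abstract's description; the parameter names, the sigmoid direction term, and the additive combination are assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
VOCAB, DIM = 10, 8

# Hypothetical parameters: word embeddings, context embeddings, and a
# per-word direction vector as the abstract describes.
W = rng.normal(scale=0.1, size=(VOCAB, DIM))  # word (input) embeddings
C = rng.normal(scale=0.1, size=(VOCAB, DIM))  # context (output) embeddings
D = rng.normal(scale=0.1, size=(VOCAB, DIM))  # direction vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def dsg_score(target, context, context_is_right):
    """Score a (target, context) pair with a directional term whose value
    depends on whether the context word is left or right of the target.
    The additive combination below is an assumption for illustration."""
    sg = C[context] @ W[target]          # ordinary skip-gram score
    d = sigmoid(D[context] @ W[target])  # probability that context is on the right
    return sg + (d if context_is_right else 1.0 - d)

# The same word pair receives different scores depending on direction,
# which is what lets training signal flow from word order.
left_score = dsg_score(3, 7, context_is_right=False)
right_score = dsg_score(3, 7, context_is_right=True)
```

Because the direction term reuses the target's word embedding, each embedding is shaped by both co-occurrence and relative position, while adding only one extra vector per word, which is consistent with the abstract's claim that training cost stays close to the original skip-gram model.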
Anthology ID:
N18-2028
Volume:
Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers)
Month:
June
Year:
2018
Address:
New Orleans, Louisiana
Editors:
Marilyn Walker, Heng Ji, Amanda Stent
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
175–180
URL:
https://aclanthology.org/N18-2028
DOI:
10.18653/v1/N18-2028
Cite (ACL):
Yan Song, Shuming Shi, Jing Li, and Haisong Zhang. 2018. Directional Skip-Gram: Explicitly Distinguishing Left and Right Context for Word Embeddings. In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers), pages 175–180, New Orleans, Louisiana. Association for Computational Linguistics.
Cite (Informal):
Directional Skip-Gram: Explicitly Distinguishing Left and Right Context for Word Embeddings (Song et al., NAACL 2018)
PDF:
https://aclanthology.org/N18-2028.pdf
Software:
 N18-2028.Software.zip