Sebastian Rudolph


2017

Gradual Learning of Matrix-Space Models of Language for Sentiment Analysis
Shima Asaadi | Sebastian Rudolph
Proceedings of the 2nd Workshop on Representation Learning for NLP

Learning word representations that capture the semantics and compositionality of language has received much research interest in natural language processing. Beyond the popular vector space models, matrix representations for words have been proposed, since matrix multiplication can then serve as a natural composition operation. In this work, we investigate the problem of learning matrix representations of words. We present a learning approach for compositional matrix-space models for the task of sentiment analysis. We show that our approach, which learns the matrices gradually in two steps, outperforms other approaches and a gradient-descent baseline in terms of quality and computational cost.
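The composition operation underlying these models can be illustrated with a short NumPy sketch. The code below is an illustrative toy and not the learning approach from the paper: the word matrices are random rather than learned, and the dimension m and the readout vectors alpha and beta are assumptions made only for the example.

```python
import numpy as np

# Sketch of a compositional matrix-space model (CMSM): each word is mapped
# to an m x m matrix, and a phrase is represented by the ordered matrix
# product of its word matrices. Because matrix multiplication is not
# commutative, word order affects the phrase representation.

m = 3  # illustrative matrix dimension (an assumption, not from the paper)
rng = np.random.default_rng(0)

# Hypothetical lexicon: word -> m x m matrix. Here the matrices are random;
# in the paper they are learned from sentiment-labelled training data.
lexicon = {
    "not": rng.normal(size=(m, m)),
    "very": rng.normal(size=(m, m)),
    "good": rng.normal(size=(m, m)),
}

def phrase_matrix(words):
    """Compose a phrase by multiplying its word matrices in order."""
    result = np.eye(m)
    for w in words:
        result = result @ lexicon[w]
    return result

# Illustrative scalar readout: project the phrase matrix onto a score with
# two fixed vectors (one simple way to obtain a sentiment value).
alpha = np.zeros(m); alpha[0] = 1.0
beta = np.zeros(m); beta[0] = 1.0

score = alpha @ phrase_matrix(["not", "very", "good"]) @ beta
print(f"sentiment score: {score:.3f}")
```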

2016

On the Correspondence between Compositional Matrix-Space Models of Language and Weighted Automata
Shima Asaadi | Sebastian Rudolph
Proceedings of the SIGFSM Workshop on Statistical NLP and Weighted Automata

2010

Compositional Matrix-Space Models of Language
Sebastian Rudolph | Eugenie Giesbrecht
Proceedings of the 48th Annual Meeting of the Association for Computational Linguistics