
Learning Word Representations with Hierarchical Sparse Coding

2014-06-08

Dani Yogatama, Manaal Faruqui, Chris Dyer, Noah A. Smith


Abstract

We propose a new method for learning word representations using hierarchical regularization in sparse coding, inspired by the linguistic study of word meanings. We present an efficient learning algorithm based on stochastic proximal methods that is significantly faster than previous approaches, making it feasible to perform hierarchical sparse coding on a corpus of billions of word tokens. Experiments on benchmark tasks (word similarity ranking, analogies, sentence completion, and sentiment analysis) demonstrate that the method outperforms or is competitive with state-of-the-art methods. Our word representations are available at http://www.ark.cs.cmu.edu/dyogatam/wordvecs/.
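The paper's hierarchical regularizer and stochastic proximal algorithm are not reproduced here, but the core building block of proximal methods for sparse coding is the soft-thresholding update. As a generic illustration only (plain L1 regularization, a random dictionary, and made-up sizes, none of which come from the paper), a minimal ISTA-style sketch looks like this:

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of the L1 norm: shrink each entry toward zero by t.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista_step(D, x, a, lam, lr):
    # One proximal gradient (ISTA) update for sparse coding:
    # minimize 0.5 * ||x - D a||^2 + lam * ||a||_1 over the code a.
    grad = D.T @ (D @ a - x)          # gradient of the smooth reconstruction loss
    return soft_threshold(a - lr * grad, lr * lam)

# Toy problem: a 20-dim signal coded over a 50-atom dictionary (illustrative sizes).
rng = np.random.default_rng(0)
D = rng.standard_normal((20, 50))
D /= np.linalg.norm(D, axis=0)        # unit-norm dictionary columns
x = rng.standard_normal(20)

a = np.zeros(50)
for _ in range(200):
    a = ista_step(D, x, a, lam=0.5, lr=0.05)

# The L1 penalty drives many code entries exactly to zero.
sparsity = np.mean(a == 0.0)
```

The paper replaces the plain L1 penalty with a hierarchical (tree-structured) regularizer and runs the proximal updates stochastically, which is what makes billion-token corpora tractable; the soft-thresholding step above is the simplest member of that family of proximal operators.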
