
Siamese CBOW: Optimizing Word Embeddings for Sentence Representations

2016-06-15 · ACL 2016 · Code Available

Tom Kenter, Alexey Borisov, Maarten de Rijke


Abstract

We present the Siamese Continuous Bag of Words (Siamese CBOW) model, a neural network for efficient estimation of high-quality sentence embeddings. Averaging the embeddings of the words in a sentence has proven to be a surprisingly successful and efficient way of obtaining sentence embeddings. However, word embeddings trained with currently available methods are not optimized for the task of sentence representation, and are thus likely to be suboptimal. Siamese CBOW addresses this problem by training word embeddings directly for the purpose of being averaged. The underlying neural network learns word embeddings by predicting, from a sentence representation, its surrounding sentences. We show the robustness of the Siamese CBOW model by evaluating it on 20 datasets stemming from a wide variety of sources.
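The averaging step the abstract refers to can be sketched as follows. This is a minimal illustration with a toy vocabulary and random embeddings, not the trained model: Siamese CBOW's contribution is learning the embedding matrix so that exactly this average becomes a good sentence vector; the vocabulary, dimensionality, and sentences below are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8  # embedding dimensionality (arbitrary for this sketch)
# Hypothetical toy vocabulary; in the paper the vocabulary comes from the corpus.
vocab = {w: i for i, w in enumerate("the cat sat on a mat".split())}
W = rng.standard_normal((len(vocab), dim))  # word embedding matrix (random here, learned in Siamese CBOW)

def sentence_embedding(sentence: str) -> np.ndarray:
    """Represent a sentence as the average of its word embeddings."""
    idxs = [vocab[w] for w in sentence.split()]
    return W[idxs].mean(axis=0)

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity, the sentence-level similarity used in this setting."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

s1 = sentence_embedding("the cat sat")
s2 = sentence_embedding("on a mat")
print(s1.shape, cosine(s1, s2))
```

During training, Siamese CBOW pushes the cosine similarity between a sentence and its neighbouring sentences up relative to randomly sampled sentences, so that averaged embeddings of adjacent sentences end up close together.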
