Corrected CBOW Performs as well as Skip-gram
2020-12-30 · EMNLP (Insights) 2021 · Code Available
Ozan İrsoy, Adrian Benton, Karl Stratos
- github.com/bloomberg/koan — official implementation (★ 261)
Abstract
Mikolov et al. (2013a) observed that continuous bag-of-words (CBOW) word embeddings tend to underperform Skip-gram (SG) embeddings, and this finding has been reported in subsequent works. We find that these observations are driven not by fundamental differences in their training objectives, but more likely by faulty negative-sampling CBOW implementations in popular libraries such as the official word2vec.c implementation and Gensim. We show that after correcting a bug in the CBOW gradient update, one can learn CBOW word embeddings that are fully competitive with SG on various intrinsic and extrinsic tasks, while being many times faster to train.
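The abstract does not spell out the bug, but in CBOW the hidden vector is the mean of the context embeddings, so the gradient flowing back to each context row should carry a 1/|context| factor. A commonly cited flaw in negative-sampling CBOW implementations is omitting exactly this factor. The sketch below is a hypothetical illustration of one corrected update step under that assumption; the function name and shapes are our own, not from the paper or the koan codebase.

```python
import numpy as np

def cbow_neg_update(W_in, W_out, context_ids, target_id, neg_ids, lr=0.05):
    """One CBOW negative-sampling step (hedged sketch, not koan's code).

    W_in:  (V, d) input/context embedding matrix
    W_out: (V, d) output embedding matrix
    The hidden vector h is the MEAN of the context rows, so the gradient
    applied to each context row must be scaled by 1/|context| -- the
    factor assumed missing in the buggy implementations.
    """
    C = len(context_ids)
    h = W_in[context_ids].mean(axis=0)                 # (d,) mean of context

    grad_h = np.zeros_like(h)
    # Positive target gets label 1, negative samples get label 0.
    for wid, label in [(target_id, 1.0)] + [(n, 0.0) for n in neg_ids]:
        score = 1.0 / (1.0 + np.exp(-h @ W_out[wid]))  # sigmoid(h . v_out)
        g = score - label                              # gradient of log loss
        grad_h += g * W_out[wid]
        W_out[wid] -= lr * g * h                       # update output vector
    # Corrected update: divide by |context|; buggy versions apply the
    # full grad_h to every context row instead.
    W_in[context_ids] -= lr * grad_h / C
```

Only the final line differs from the (assumed) buggy variant, which would subtract `lr * grad_h` from each context row without the `/ C`.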