
Learning Word Embeddings without Context Vectors

2019-08-01 · WS 2019

Alexey Zobnin, Evgenia Elistratova


Abstract

Most word embedding algorithms, such as word2vec or fastText, construct two sorts of vectors: one for words and one for contexts. Naively using only one sort of vector leads to poor results. We suggest using an indefinite inner product in the skip-gram negative sampling (SGNS) algorithm. This allows us to use only one sort of vector without loss of quality. Our "context-free" (CF) algorithm performs on par with SGNS on word similarity datasets.
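The idea in the abstract can be sketched as follows: keep a single embedding table and score word–context pairs with an indefinite inner product ⟨u, v⟩_D = uᵀDv, where D is a diagonal matrix of ±1 entries, plugged into the usual SGNS objective. This is a minimal illustration, not the authors' implementation; the dimensions, the split of signs in D, and the function names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, dim, n_pos = 1000, 50, 25  # split of +1/-1 signs in D is assumed

# Single embedding table: no separate context vectors.
E = rng.normal(scale=0.1, size=(vocab_size, dim))

# Signature of the (assumed) indefinite inner product: D = diag(+1,...,-1,...).
signature = np.concatenate([np.ones(n_pos), -np.ones(dim - n_pos)])

def score(w, c):
    """Indefinite inner product <E[w], E[c]>_D between two word embeddings."""
    return float(np.sum(E[w] * signature * E[c]))

def sgns_pair_loss(w, c_pos, c_negs):
    """Standard SGNS loss, but with the indefinite product and one vector table."""
    sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
    loss = -np.log(sigmoid(score(w, c_pos)))
    for c in c_negs:
        loss -= np.log(sigmoid(-score(w, c)))
    return loss
```

Note that the indefinite product is still symmetric, so word and context play interchangeable roles, which is what makes a single vector table viable.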
