
Disambiguated skip-gram model

2018-10-01 · EMNLP 2018

Karol Grzegorczyk, Marcin Kurdziel


Abstract

We present disambiguated skip-gram: a neural-probabilistic model for learning multi-sense distributed representations of words. Disambiguated skip-gram jointly estimates a skip-gram-like context word prediction model and a word sense disambiguation model. Unlike previous probabilistic models for learning multi-sense word embeddings, disambiguated skip-gram is end-to-end differentiable and can be interpreted as a simple feed-forward neural network. We also introduce an effective pruning strategy for the embeddings learned by disambiguated skip-gram. This allows us to control the granularity of representations learned by our model. In experimental evaluation, disambiguated skip-gram improves state-of-the-art results on several word sense induction benchmarks.
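The abstract gives no implementation details, but the core idea it describes, a soft, differentiable sense-disambiguation step feeding a skip-gram-style context predictor, can be illustrated with a rough numpy sketch. Everything below is an assumption for illustration: the shapes, the context summary (mean of output embeddings), and the sense-scoring parameterization (`disambig_w`) are hypothetical, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

V, K, D = 50, 3, 8  # toy sizes: vocabulary, senses per word, embedding dim

# Hypothetical parameters: one input vector per (word, sense), shared output
# vectors for context prediction, and per-sense scoring weights for disambiguation.
sense_vecs = rng.normal(scale=0.1, size=(V, K, D))
out_vecs = rng.normal(scale=0.1, size=(V, D))
disambig_w = rng.normal(scale=0.1, size=(V, K, D))

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def forward(center, context_window):
    """Soft sense disambiguation followed by skip-gram context prediction."""
    # Summarize the surrounding words (here: mean of their output embeddings).
    ctx = out_vecs[context_window].mean(axis=0)          # (D,)
    # Score each sense of the center word against the context; softmax gives a
    # differentiable P(sense | word, context) instead of a hard sense assignment.
    sense_probs = softmax(disambig_w[center] @ ctx)      # (K,)
    # Mix the sense vectors by their probabilities, keeping the whole pipeline
    # end-to-end differentiable, as the abstract emphasizes.
    mixed = sense_probs @ sense_vecs[center]             # (D,)
    # Standard skip-gram softmax over the vocabulary for context-word prediction.
    context_probs = softmax(out_vecs @ mixed)            # (V,)
    return sense_probs, context_probs

sense_probs, context_probs = forward(center=4, context_window=[1, 2, 6, 7])
```

Because the disambiguation is a softmax mixture rather than a discrete choice, gradients flow through the sense assignment, which is what makes the model trainable as a plain feed-forward network.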
