
Word Sense Induction with Neural biLM and Symmetric Patterns

2018-08-26 · EMNLP 2018 · Code Available

Asaf Amrami, Yoav Goldberg


Abstract

An established method for Word Sense Induction (WSI) uses a language model to predict probable substitutes for target words, and induces senses by clustering the resulting substitute vectors. We replace the ngram-based language model (LM) with a recurrent one. Beyond being more accurate, the recurrent LM allows us to query it in a creative way, using what we call dynamic symmetric patterns. The combination of the RNN-LM and the dynamic symmetric patterns results in strong substitute vectors for WSI, allowing us to surpass the current state-of-the-art on the SemEval 2013 WSI shared task by a large margin.
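To make the substitute-vector idea concrete, here is a minimal sketch of the clustering step. The substitute lists below are invented for illustration (imagine them as a biLM's top predictions for a pattern like "bank and ___"); the real system uses a neural biLM and a different clustering algorithm, so this greedy cosine-similarity grouping is a simplification, not the paper's method.

```python
from collections import Counter
import math

def substitute_vector(substitutes):
    """Bag-of-substitutes representation for one occurrence of the target word."""
    return Counter(substitutes)

def cosine(a, b):
    """Cosine similarity between two Counter vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def cluster(vectors, threshold=0.1):
    """Greedy clustering: assign each occurrence to the most similar
    existing cluster centroid, or start a new cluster. (A simplification
    of the clustering used for WSI.)"""
    clusters = []  # list of [centroid Counter, member indices]
    for i, v in enumerate(vectors):
        best, best_sim = None, threshold
        for c in clusters:
            sim = cosine(v, c[0])
            if sim > best_sim:
                best, best_sim = c, sim
        if best is None:
            clusters.append([Counter(v), [i]])
        else:
            best[0].update(v)   # fold occurrence into the centroid
            best[1].append(i)
    return [members for _, members in clusters]

# Hypothetical substitute lists for four occurrences of "bank":
# two financial-sense contexts, two river-sense contexts.
occurrences = [
    ["lender", "institution", "firm"],
    ["lender", "institution", "company"],
    ["shore", "edge", "riverbank"],
    ["shore", "side", "edge"],
]
vectors = [substitute_vector(s) for s in occurrences]
senses = cluster(vectors)
print(senses)  # occurrences grouped by induced sense
```

Occurrences sharing substitutes (e.g. "lender", "institution") land in one cluster, while the river-sense contexts form another, which is the intuition behind inducing senses from substitute vectors.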
