A Comparison of Context-sensitive Models for Lexical Substitution

2019-05-01 · WS 2019

Aina Garí Soler, Anne Cocos, Marianna Apidianaki, Chris Callison-Burch

Abstract

Word embedding representations provide good estimates of word meaning and give state-of-the-art performance in semantic tasks. Embedding approaches differ as to whether and how they account for the context surrounding a word. We present a comparison of different word and context representations on the task of proposing substitutes for a target word in context (lexical substitution). We also experiment with tuning contextualized word embeddings on a dataset of sense-specific instances for each target word. We show that powerful contextualized word representations, which give high performance in several semantics-related tasks, deal less well with the subtle in-context similarity relationships needed for substitution. This is better handled by models trained with this objective in mind, where the inter-dependence between word and context representations is explicitly modeled during training.
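
As a rough illustration of the kind of embedding-based substitute ranking the abstract describes, the sketch below scores candidate substitutes by the cosine similarity between the target's in-context vector and each candidate's vector when placed in the same slot. The model choice (bert-base-uncased), the candidate list, the naive first-subtoken lookup, and the replace-and-compare scoring heuristic are all illustrative assumptions, not the paper's exact pipeline.

```python
# Minimal sketch of contextualized-embedding lexical substitution ranking.
# Assumes the Hugging Face `transformers` library and PyTorch; the model,
# sentence, and candidates are illustrative, not the paper's setup.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def embed_word_in_context(sentence: str, word: str) -> torch.Tensor:
    """Return the contextualized vector of `word`'s first subtoken in `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (seq_len, dim)
    # Naive lookup: match on the word's first subtoken, first occurrence.
    word_id = tokenizer.convert_tokens_to_ids(tokenizer.tokenize(word)[0])
    position = (inputs["input_ids"][0] == word_id).nonzero()[0].item()
    return hidden[position]

sentence = "The bright student solved the problem quickly."
target = "bright"
candidates = ["smart", "shiny", "clever", "luminous"]

# Score each candidate by placing it in the target's slot and comparing
# its in-context vector with the target's in-context vector.
target_vec = embed_word_in_context(sentence, target)
scores = {}
for cand in candidates:
    cand_vec = embed_word_in_context(sentence.replace(target, cand), cand)
    scores[cand] = torch.cosine_similarity(target_vec, cand_vec, dim=0).item()

for cand, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{cand}: {score:.3f}")
```

A ranking like this reflects how well each substitute preserves the target's contextual meaning; the abstract's point is that off-the-shelf contextualized representations capture this less reliably than models whose training objective ties word and context representations together.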
