
Syntax-Aware Multi-Sense Word Embeddings for Deep Compositional Models of Meaning

2015-08-10 · EMNLP 2015

Jianpeng Cheng, Dimitri Kartsaklis


Abstract

Deep compositional models of meaning, which act on distributional representations of words to produce vectors for larger text constituents, are evolving into a popular area of NLP research. We detail a compositional distributional framework based on a rich form of word embeddings that aims at facilitating the interactions between words in the context of a sentence. Embeddings and composition layers are jointly learned against a generic objective that enhances the vectors with syntactic information from the surrounding context. Furthermore, each word is associated with a number of senses, the most plausible of which is selected dynamically during the composition process. We evaluate the produced vectors qualitatively and quantitatively with positive results. At the sentence level, the effectiveness of the framework is demonstrated on the MSRPar task, for which we report results within the state-of-the-art range.
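The dynamic sense selection described in the abstract can be illustrated with a minimal sketch. Note the assumptions: the paper learns sense selection jointly with deep composition layers, whereas the snippet below uses a simple cosine-similarity rule against a bag-of-context vector and additive composition, both hypothetical stand-ins chosen only to make the idea concrete.

```python
import numpy as np

def select_sense(sense_vectors, context_vector):
    """Pick the sense vector most similar (by cosine) to the context.

    sense_vectors: (n_senses, dim) array of candidate sense embeddings.
    context_vector: (dim,) vector summarizing the surrounding words.
    This selection rule is an illustrative assumption, not the paper's
    learned mechanism.
    """
    norms = np.linalg.norm(sense_vectors, axis=1) * np.linalg.norm(context_vector)
    sims = sense_vectors @ context_vector / np.maximum(norms, 1e-12)
    return sense_vectors[int(np.argmax(sims))]

def compose(sentence_senses):
    """Compose a sentence vector from per-word sense inventories.

    For each word, the context is the average of the other words' first-sense
    vectors; the best-matching sense is then selected and added. Additive
    composition stands in for the paper's deep compositional model.
    """
    dim = sentence_senses[0].shape[1]
    composed = np.zeros(dim)
    for i, senses in enumerate(sentence_senses):
        others = [s[0] for j, s in enumerate(sentence_senses) if j != i]
        context = np.mean(others, axis=0) if others else np.zeros(dim)
        composed += select_sense(senses, context)
    return composed
```

For example, a two-sense word whose first sense aligns with a finance-like context vector would contribute that sense to the composition, while the same word in a river-like context would contribute its other sense.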
