
Prepositional Phrase Attachment over Word Embedding Products

WS 2017 · 2017-09-01

Pranava Swaroop Madhyastha, Xavier Carreras, Ariadna Quattoni


Abstract

We present a low-rank multi-linear model for the task of resolving prepositional phrase attachment ambiguity (the PP task). Our model exploits tensor products of word embeddings, capturing all possible conjunctions of latent embeddings. Our results on a wide range of datasets and task settings show that tensor products are the best compositional operation, and that a relatively simple multi-linear model using only word embeddings of lexical features can outperform more complex non-linear architectures that exploit the same information. Our proposed model gives the current best reported performance on an out-of-domain evaluation and performs competitively on out-of-domain dependency parsing datasets.
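The abstract's core idea, a low-rank bilinear scorer over tensor products of word embeddings, can be sketched as follows. This is an illustrative sketch, not the authors' code: the dimensions, random embeddings, and factor matrices `U` and `V` are all hypothetical, and the real model handles richer feature conjunctions and multiple candidate attachment sites.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 50, 10  # embedding dimension and model rank (illustrative values)

# Hypothetical embeddings for a candidate attachment head and the PP child.
e_head = rng.standard_normal(d)
e_child = rng.standard_normal(d)

# Low-rank factorization W = U @ V.T: the full d x d weight over the
# tensor product of embeddings is never materialized.
U = rng.standard_normal((d, r))
V = rng.standard_normal((d, r))

def score(x, y, U, V):
    # Equivalent to x @ (U @ V.T) @ y: a weighted sum over all pairwise
    # products x_i * y_j, i.e. every conjunction of latent coordinates.
    return float((x @ U) @ (V.T @ y))

# Sanity check: the low-rank form agrees with the explicit computation.
W_full = U @ V.T
assert np.isclose(score(e_head, e_child, U, V), e_head @ W_full @ e_child)
```

At prediction time, one would score each candidate head against the PP and attach to the argmax; the low-rank form keeps the parameter count at 2dr instead of d².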
