Resolving Prepositional Phrase Attachment Ambiguities with Contextualized Word Embeddings

2021-12-01 · ICON 2021 · Code Available

Adwait Ratnaparkhi, Atul Kumar

Abstract

This paper applies contextualized word embedding models to a long-standing problem in the natural language parsing community, namely prepositional phrase attachment. Following past formulations of this problem, we use datasets in which the attachment decision is framed both as a binary-valued and as a multi-valued choice. We present a deep learning architecture that fine-tunes the output of a contextualized word embedding model to predict attachment decisions. Experiments on two commonly used datasets outperform the previous best results, using only the original training data and the unannotated full-sentence context.
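The abstract describes a classification head over contextual token embeddings that predicts an attachment site. A minimal sketch of one plausible such head is below; the shapes, the two-layer MLP, and the use of the classic (verb, noun, preposition, object) 4-tuple are illustrative assumptions, not the paper's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def attachment_logits(token_embeddings, W1, b1, W2, b2):
    """Score verb vs. noun attachment from contextual token embeddings.

    token_embeddings: (4, d) array holding the embeddings of the verb,
    the object noun, the preposition, and the prepositional object
    (the classic 4-tuple used in binary PP-attachment datasets).
    Returns 2 logits: index 0 = verb attachment, 1 = noun attachment.
    """
    x = token_embeddings.reshape(-1)      # concatenate the 4 vectors
    h = np.maximum(0.0, W1 @ x + b1)      # ReLU hidden layer
    return W2 @ h + b2                    # 2-way attachment logits

# Toy dimensions and randomly initialized weights (illustrative only).
d, hidden = 8, 16
W1 = rng.normal(size=(hidden, 4 * d)); b1 = np.zeros(hidden)
W2 = rng.normal(size=(2, hidden));     b2 = np.zeros(2)

emb = rng.normal(size=(4, d))   # stand-in for contextualized embeddings
logits = attachment_logits(emb, W1, b1, W2, b2)
pred = int(np.argmax(logits))   # chosen attachment site
```

In the paper's setting, the stand-in embeddings would come from a contextualized encoder run over the full sentence, and the whole stack would be fine-tuned end to end; the multi-valued variant would simply widen the output layer to one logit per candidate attachment site.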
