SOTAVerified

Globally Normalized Transition-Based Neural Networks

2016-03-19 · ACL 2016 · Code Available

Daniel Andor, Chris Alberti, David Weiss, Aliaksei Severyn, Alessandro Presta, Kuzman Ganchev, Slav Petrov, Michael Collins


Abstract

We introduce a globally normalized transition-based neural network model that achieves state-of-the-art part-of-speech tagging, dependency parsing and sentence compression results. Our model is a simple feed-forward neural network that operates on a task-specific transition system, yet achieves comparable or better accuracies than recurrent models. We discuss the importance of global as opposed to local normalization: a key insight is that the label bias problem implies that globally normalized models can be strictly more expressive than locally normalized models.
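The local-versus-global distinction in the abstract can be illustrated with a minimal sketch (toy scores invented for illustration, not the paper's model or features). A locally normalized model applies a softmax over actions at every transition step, so each step's probability mass is fixed before later evidence is seen; a globally normalized model scores whole transition sequences and normalizes once over all of them, CRF-style.

```python
import itertools
import math

# Hypothetical transition scores, illustrative only (not from the paper):
# a step-0 score per action, and a step-1 score conditioned on the previous action.
S0 = {0: 1.0, 1: 1.0}
S1 = {0: {0: 0.0, 1: 0.0},   # after action 0, step 1 is uninformative
      1: {0: 5.0, 1: 0.0}}   # after action 1, action 0 is strongly preferred

def total_score(seq):
    a0, a1 = seq
    return S0[a0] + S1[a0][a1]

def local_prob(seq):
    """Locally normalized: a softmax over actions at each step, multiplied out."""
    a0, a1 = seq
    z0 = sum(math.exp(S0[a]) for a in (0, 1))
    z1 = sum(math.exp(S1[a0][a]) for a in (0, 1))
    return (math.exp(S0[a0]) / z0) * (math.exp(S1[a0][a1]) / z1)

def global_prob(seq):
    """Globally normalized: one softmax over whole-sequence scores (CRF-style)."""
    z = sum(math.exp(total_score(s)) for s in itertools.product((0, 1), repeat=2))
    return math.exp(total_score(seq)) / z

seqs = list(itertools.product((0, 1), repeat=2))
# Both define valid distributions over the four action sequences,
# but the local model can put at most 0.5 on sequence (1, 0): step 0's
# probability is committed before step 1's evidence arrives (label bias).
# The global model is free to concentrate mass on the high-scoring sequence.
print(sum(local_prob(s) for s in seqs))    # ~1.0
print(sum(global_prob(s) for s in seqs))   # ~1.0
print(local_prob((1, 0)), global_prob((1, 0)))
```

In this toy setting the locally normalized model assigns sequence (1, 0) at most probability 0.5, while the globally normalized one assigns it roughly 0.98, which is the expressivity gap the abstract's label-bias argument refers to.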

Tasks

- Part-of-speech tagging
- Dependency parsing
- Sentence compression

Benchmark Results

Dataset         Model          Metric  Claimed  Verified  Status
Penn Treebank   Andor et al.   LAS     92.79    —         Unverified

Reproductions

None yet.