SOTAVerified

Second-Order Neural Dependency Parsing with Message Passing and End-to-End Training

2020-10-10 · Asian Chapter of the Association for Computational Linguistics · Code Available

Xinyu Wang, Kewei Tu


Abstract

In this paper, we propose second-order graph-based neural dependency parsing using message passing and end-to-end neural networks. We empirically show that our approaches match the accuracy of very recent state-of-the-art second-order graph-based neural dependency parsers while running significantly faster in both training and testing. We also empirically show the advantage of second-order parsing over first-order parsing, and observe that the usefulness of the head-selection structured constraint vanishes when BERT embeddings are used.
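The message-passing approach described in the abstract can be approximated with mean-field variational inference (MFVI, the model name in the results table): each modifier keeps a distribution over candidate heads, and second-order factors iteratively refine the first-order arc scores. The sketch below is an illustrative NumPy implementation under assumed conventions, not the authors' code; the score tensors `unary` and `sibling` and the fixed iteration count are simplifying assumptions.

```python
import numpy as np

def softmax_over_heads(scores):
    # Normalize over candidate heads (axis 0) for each modifier column.
    e = np.exp(scores - scores.max(axis=0, keepdims=True))
    return e / e.sum(axis=0, keepdims=True)

def mfvi_arc_marginals(unary, sibling, iterations=3):
    """Mean-field sketch for second-order dependency parsing.

    unary[h, m]      : first-order score for the arc h -> m
    sibling[h, m, s] : second-order score for arcs h -> m and h -> s
                       co-occurring (a simplified sibling factor)
    Returns q[h, m], an approximate head distribution for each modifier.
    """
    # Initialize with the first-order (head-selection) distribution.
    q = softmax_over_heads(unary)
    for _ in range(iterations):
        # Message from second-order factors: expected sibling score
        # under the current arc beliefs q.
        msg = np.einsum('hms,hs->hm', sibling, q)
        # Update beliefs using unary scores plus incoming messages.
        q = softmax_over_heads(unary + msg)
    return q
```

With a few iterations the update is cheap and fully differentiable, which is what allows end-to-end training: gradients flow through the unrolled mean-field steps back to the networks producing the score tensors.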

Tasks

Dependency Parsing

Benchmark Results

| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| Chinese Treebank | MFVI | LAS | 91.69 | — | Unverified |
| Penn Treebank | MFVI | LAS | 95.34 | — | Unverified |

Reproductions