Second-Order Neural Dependency Parsing with Message Passing and End-to-End Training
2020-10-10 · Asian Chapter of the Association for Computational Linguistics
Xinyu Wang, Kewei Tu
- github.com/wangxinyu0922/Second_Order_Parsing (official, in paper, PyTorch, ★ 14)
Abstract
In this paper, we propose second-order graph-based neural dependency parsing using message passing and end-to-end neural networks. We empirically show that our approaches match the accuracy of very recent state-of-the-art second-order graph-based neural dependency parsers and have significantly faster speed in both training and testing. We also empirically show the advantage of second-order parsing over first-order parsing and observe that the usefulness of the head-selection structured constraint vanishes when using BERT embedding.
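The message passing in the abstract refers to mean-field variational inference (MFVI) over arc variables, where each modifier's head distribution is iteratively refined using second-order (e.g. sibling) scores. A minimal NumPy sketch of such an update loop is shown below; the function name, score shapes, and the restriction to sibling factors are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def mfvi_parse(s_edge, s_sib, iterations=3):
    """Hedged sketch of MFVI message passing for second-order parsing.

    s_edge: (n, n) first-order scores, s_edge[h, m] = score of arc h -> m.
    s_sib:  (n, n, n) sibling scores, s_sib[h, m, m2] = score of h having
            both m and m2 as modifiers (illustrative factorization only).
    Returns q: (n, n) approximate posterior over heads; columns sum to 1.
    """
    # initialize q as a softmax over candidate heads of the unary scores
    q = np.exp(s_edge - s_edge.max(axis=0))
    q /= q.sum(axis=0)
    for _ in range(iterations):
        # message for arc (h, m): expected sibling score under current q
        msg = np.einsum('hmk,hk->hm', s_sib, q)
        # mean-field update: renormalize unary + message over heads
        logits = s_edge + msg
        q = np.exp(logits - logits.max(axis=0))
        q /= q.sum(axis=0)
    return q
```

After the final iteration, each token's head can be read off with `q.argmax(axis=0)`; in an end-to-end model these updates are unrolled and trained through, which is what makes the approach differentiable.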
Benchmark Results
| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| Chinese Treebank | MFVI | LAS | 91.69 | — | Unverified |
| Penn Treebank | MFVI | LAS | 95.34 | — | Unverified |