AMR Parsing as Sequence-to-Graph Transduction
2019-05-21 · ACL 2019
Sheng Zhang, Xutai Ma, Kevin Duh, Benjamin Van Durme
- github.com/sheng-z/stog (official, in paper, PyTorch, ★ 156)
Abstract
We propose an attention-based model that treats AMR parsing as sequence-to-graph transduction. Unlike most AMR parsers, which rely on pre-trained aligners, external semantic resources, or data augmentation, our proposed parser is aligner-free and can be effectively trained with limited amounts of labeled AMR data. Our parser surpasses all previously reported SMATCH scores on both AMR 2.0 (76.3% F1 on LDC2017T10) and AMR 1.0 (70.2% F1 on LDC2014T12).
Tasks
- AMR Parsing
Benchmark Results
| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| LDC2014T12 | Two-stage Sequence-to-Graph Transducer | F1 Full | 70.2 | — | Unverified |
| LDC2014T12 | Sequence-to-Graph Transduction | F1 Newswire | 75.0 | — | Unverified |
| LDC2017T10 | Sequence-to-Graph Transduction | Smatch | 76.3 | — | Unverified |
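The Smatch scores reported above are F1 values over matched semantic triples between a predicted and a gold AMR graph. As a minimal sketch (the real Smatch tool additionally searches over variable mappings to maximize the triple match; the counts below are illustrative, not from the paper):

```python
def smatch_f1(matched: int, pred_total: int, gold_total: int) -> float:
    """F1 over matched AMR triples: harmonic mean of precision and recall."""
    if matched == 0:
        return 0.0
    precision = matched / pred_total  # fraction of predicted triples that are correct
    recall = matched / gold_total     # fraction of gold triples that were recovered
    return 2 * precision * recall / (precision + recall)

# Hypothetical counts: 70 triples matched out of 100 predicted and 100 gold.
print(round(smatch_f1(70, 100, 100), 3))  # → 0.7
```

Multiplying by 100 gives the percentage form used in the table (e.g. 76.3% F1).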