A Graph-to-Sequence Model for AMR-to-Text Generation
Linfeng Song, Yue Zhang, Zhiguo Wang, Daniel Gildea
Code
- github.com/freesunshine0316/neural-graph-to-seq-mp (official, in paper, TensorFlow)
Abstract
The problem of AMR-to-text generation is to recover a text representing the same meaning as an input AMR graph. The current state-of-the-art method uses a sequence-to-sequence model, leveraging an LSTM to encode a linearized AMR structure. Although able to model non-local semantic information, a sequence LSTM can lose information from the AMR graph structure, and thus faces challenges with large graphs, which result in long sequences. We introduce a neural graph-to-sequence model, using a novel LSTM structure for directly encoding graph-level semantics. On a standard benchmark, our model shows superior results to existing methods in the literature.
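To make the idea of "directly encoding graph-level semantics" concrete, here is a minimal, hypothetical sketch of a graph-state LSTM step (not the paper's exact formulation): every AMR node keeps its own LSTM state and, at each step, receives the sum of its neighbors' hidden states as input, so information propagates along graph edges instead of a linearized token sequence. All weight shapes, the gate layout, and the toy chain graph below are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def graph_lstm_step(h, c, adj, W, U, b):
    """One parallel state update for all n nodes.

    h, c : (n, d) hidden / cell states
    adj  : (n, n) adjacency matrix of the graph
    W, U : (d, 4d) weights for the message and the node's own state
    b    : (4d,) bias; gate layout is [input, forget, output, candidate]
    """
    m = adj @ h                    # message: sum of neighbors' hidden states
    z = m @ W + h @ U + b          # compute all four gates at once
    d = h.shape[1]
    i = sigmoid(z[:, :d])          # input gate
    f = sigmoid(z[:, d:2 * d])     # forget gate
    o = sigmoid(z[:, 2 * d:3 * d]) # output gate
    g = np.tanh(z[:, 3 * d:])      # candidate cell state
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

# Toy 3-node chain graph 0 - 1 - 2: after 2+ steps, node 0's state
# can reflect node 2, even though they are not directly connected.
n, d = 3, 4
rng = np.random.default_rng(0)
adj = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
h = np.zeros((n, d))
c = np.zeros((n, d))
W = rng.normal(scale=0.1, size=(d, 4 * d))
U = rng.normal(scale=0.1, size=(d, 4 * d))
b = np.zeros(4 * d)
for _ in range(3):
    h, c = graph_lstm_step(h, c, adj, W, U, b)
print(h.shape)  # per-node hidden states, ready for an attention-based decoder
```

Running several such steps lets each node's representation absorb multi-hop context, which is the key advantage over encoding a linearization, where distant graph neighbors can end up far apart in the sequence.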
Benchmark Results
| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| LDC2015E86 | GRN | BLEU | 33.6 | — | Unverified |