Deep Graph Convolutional Encoders for Structured Data to Text Generation
2018-10-23 · WS 2018 · Code Available
Diego Marcheggiani, Laura Perez-Beltrachini
Code
- github.com/diegma/graph-2-text — Official (in paper), PyTorch, ★ 153
- github.com/dice-group/NABU — TensorFlow, ★ 8
Abstract
Most previous work on neural text generation from graph-structured data relies on standard sequence-to-sequence methods. These approaches linearise the input graph before feeding it to a recurrent neural network. In this paper, we propose an alternative encoder based on graph convolutional networks that directly exploits the input structure. We report results on two graph-to-sequence datasets that empirically show the benefits of explicitly encoding the input graph structure.
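The core idea of a graph convolutional encoder is that each node's representation is updated by aggregating the features of its graph neighbours, rather than by processing a linearised node sequence. The sketch below illustrates one such propagation step in plain numpy; it is a minimal, simplified version of the idea (mean aggregation with self-loops and ReLU), not the paper's exact model, which additionally handles edge directions and labels. All names (`gcn_layer`, the toy graph) are illustrative assumptions.

```python
import numpy as np

def gcn_layer(adj, h, w):
    """One simplified graph-convolution step (illustrative, not the
    paper's exact formulation): each node averages its neighbours'
    features (plus its own, via self-loops), then applies a shared
    linear map followed by ReLU.
    adj: (n, n) adjacency matrix, h: (n, d_in) node features,
    w: (d_in, d_out) shared weight matrix."""
    a_hat = adj + np.eye(adj.shape[0])      # add self-loops
    deg = a_hat.sum(axis=1, keepdims=True)  # node degrees
    msg = (a_hat / deg) @ h                 # mean over each neighbourhood
    return np.maximum(msg @ w, 0.0)         # linear map + ReLU

# Toy 3-node path graph: edges 0-1 and 1-2.
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
h = np.eye(3)            # one-hot node features
w = np.ones((3, 2))      # toy weights
out = gcn_layer(adj, h, w)
print(out.shape)  # (3, 2): one d_out-dimensional vector per node
```

Stacking several such layers lets information flow between nodes that are several hops apart, which is what allows the encoder to exploit graph structure that a linearised sequence obscures.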
Benchmark Results
| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| SR11Deep | GCN + feat | BLEU | 0.67 | — | Unverified |
| WebNLG | GCN EC | BLEU | 55.9 | — | Unverified |