A Hierarchical Model for Data-to-Text Generation
Clément Rebuffel, Laure Soulier, Geoffrey Scoutheeten, Patrick Gallinari
Code: github.com/KaijuML/data-to-text-hierarchical (official implementation, PyTorch)
Abstract
Transcribing structured data into natural language descriptions has emerged as a challenging task, referred to as "data-to-text". These structures generally group multiple elements together with their attributes. Most approaches rely on encoder-decoder methods from machine translation, which linearize the elements into a sequence; this, however, discards most of the structure contained in the data. In this work, we propose to overcome this limitation with a hierarchical model that encodes the data structure at both the element level and the structure level. Evaluations on RotoWire show the effectiveness of our model w.r.t. qualitative and quantitative metrics.
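The two-level encoding idea can be illustrated with a toy sketch (hypothetical, not the authors' code): each entity's record embeddings are first pooled into an entity vector (element level), and the entity vectors are then pooled into a single data representation (structure level). The paper uses Transformer self-attention at each level; mean pooling here is a stand-in for readability.

```python
def mean_pool(vectors):
    """Average a list of equal-length vectors component-wise."""
    dim = len(vectors[0])
    return [sum(v[i] for v in vectors) / len(vectors) for i in range(dim)]

def encode_hierarchically(entities):
    """entities: list of entities, each a list of record embeddings.

    Returns (structure-level vector, element-level entity vectors)."""
    entity_vecs = [mean_pool(records) for records in entities]  # element level
    return mean_pool(entity_vecs), entity_vecs                  # structure level

# Two entities (e.g. two players), each with two 4-dim record embeddings.
data = [
    [[1.0, 0.0, 0.0, 0.0], [0.0, 1.0, 0.0, 0.0]],
    [[0.0, 0.0, 1.0, 0.0], [0.0, 0.0, 0.0, 1.0]],
]
ctx, ents = encode_hierarchically(data)
# ctx → [0.25, 0.25, 0.25, 0.25]; ents[0] → [0.5, 0.5, 0.0, 0.0]
```

In the actual model, the decoder attends hierarchically over both levels, which preserves entity boundaries that a flat linearization would lose.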
Benchmark Results
| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| RotoWire | Hierarchical Transformer encoder + conditional copy | BLEU | 17.5 | — | Unverified |
| RotoWire (Content Ordering) | Hierarchical Transformer encoder + conditional copy | DLD | 18.9 | — | Unverified |
| RotoWire (Content Selection) | Hierarchical Transformer encoder + conditional copy | Precision | 39.47 | — | Unverified |
| RotoWire (Relation Generation) | Hierarchical Transformer encoder + conditional copy | Precision | 89.46 | — | Unverified |