Data-to-Text Generation
A classic problem in natural-language generation (NLG) is to take structured data, such as a table, as input and to produce text that adequately and fluently describes that data as output. Unlike machine translation, which aims for a complete transduction of the source sentence, this form of NLG is usually taken to involve (at least) two separate challenges: *what to say*, the selection of an appropriate subset of the input data to discuss, and *how to say it*, the surface realization of the selected content.
(Image credit: Data-to-Text Generation with Content Selection and Planning)
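The two-stage view above (content selection, then surface realization) can be sketched as a toy pipeline. Everything here is invented for illustration: the record schema, the `salience` field, and the templates are assumptions, and real systems learn both stages rather than hard-coding them.

```python
# Toy two-stage data-to-text pipeline: select salient records, then
# render them with templates. All field names and templates are invented.

def select_content(records, max_records=2):
    """'What to say': keep the highest-salience records."""
    ranked = sorted(records, key=lambda r: r["salience"], reverse=True)
    return ranked[:max_records]

def realize(records):
    """'How to say it': render the selected records with simple templates."""
    templates = {
        "points": "{player} scored {value} points",
        "rebounds": "{player} grabbed {value} rebounds",
    }
    clauses = [templates[r["type"]].format(player=r["player"], value=r["value"])
               for r in records]
    return " and ".join(clauses) + "."

records = [
    {"type": "points",   "player": "Smith", "value": 28, "salience": 0.9},
    {"type": "rebounds", "player": "Smith", "value": 4,  "salience": 0.3},
    {"type": "points",   "player": "Jones", "value": 25, "salience": 0.8},
]
print(realize(select_content(records)))
```

The low-salience rebounds record is dropped by the selection stage, so only the two scoring records are verbalized.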
Papers
Showing 1–10 of 219 papers
Datasets: WebNLG, E2E NLG Challenge, WebNLG Full, Cleaned E2E NLG Challenge, RotoWire, RotoWire (Relation Generation), ToTTo, XAlign, DART, MULTIWOZ 2.1, RotoWire (Content Ordering), RotoWire (Content Selection)
Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Control Prefixes (A1, T5-large) | BLEU | 67.32 | — | Unverified |
| 2 | Control Prefixes (A1, A2, T5-large) | BLEU | 67.15 | — | Unverified |
| 3 | JointGT Baseline | BLEU | 67.08 | — | Unverified |
| 4 | FactT5B | BLEU | 67.04 | — | Unverified |
| 5 | T5B Baseline | BLEU | 67.04 | — | Unverified |
| 6 | FactJointGT | BLEU | 66.89 | — | Unverified |
| 7 | T5-large + Wiki + Position | BLEU | 66.07 | — | Unverified |
| 8 | HTLM (fine-tuning) | BLEU | 65.4 | — | Unverified |
| 9 | T5-small | BLEU | 65.05 | — | Unverified |
| 10 | TrICy (trK = trk* = 0.24) | BLEU | 64.73 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | S_1^R | BLEU | 68.6 | — | Unverified |
| 2 | EDA_CS | BLEU | 67.05 | — | Unverified |
| 3 | TrICy (trK = 0) | BLEU | 66.43 | — | Unverified |
| 4 | Slug | BLEU | 66.19 | — | Unverified |
| 5 | TGen | BLEU | 65.93 | — | Unverified |
| 6 | EDA_CS (TL) | BLEU | 65.8 | — | Unverified |
| 7 | Sys1-Primary | BLEU | 65.61 | — | Unverified |
| 8 | Zhang | BLEU | 65.45 | — | Unverified |
| 9 | Self-memory | BLEU | 65.11 | — | Unverified |
| 10 | Gong | BLEU | 64.22 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Control Prefixes (A1, A2, T5-large) | BLEU | 62.27 | — | Unverified |
| 2 | Control Prefixes (A1, T5-large) | BLEU | 61.94 | — | Unverified |
| 3 | T5-large + Wiki + Position | BLEU | 60.56 | — | Unverified |
| 4 | T5-large | BLEU | 59.7 | — | Unverified |
| 5 | T5-Large | BLEU | 57.1 | — | Unverified |
| 6 | HTLM (prefix 0.1%) | BLEU | 56.3 | — | Unverified |
| 7 | DATATUNER_NO_FC | BLEU | 52.9 | — | Unverified |
| 8 | Transformer (Pipeline) | BLEU | 51.68 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Control Prefixes (T5-large) | BLEU (Test set) | 44.15 | — | Unverified |
| 2 | DataTuner_FC | BLEU (Test set) | 43.6 | — | Unverified |
| 3 | TGen | BLEU (Test set) | 40.73 | — | Unverified |
| 4 | LSTM | METEOR (Validation set) | 0.39 | — | Unverified |
| 5 | TGen | METEOR (Validation set) | 0.39 | — | Unverified |
| 6 | BART | METEOR (Validation set) | 0.37 | — | Unverified |
| 7 | T5 | METEOR (Validation set) | 0.37 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | HierarchicalEncoder + NR + IR | BLEU | 17.96 | — | Unverified |
| 2 | Hierarchical transformer encoder + conditional copy | BLEU | 17.5 | — | Unverified |
| 3 | Force-Copy | BLEU | 17.26 | — | Unverified |
| 4 | Neural Content Planning + conditional copy | BLEU | 16.5 | — | Unverified |
| 5 | Macro | BLEU | 15.46 | — | Unverified |
| 6 | Encoder-decoder + conditional copy | BLEU | 14.19 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | SeqPlan | Precision | 97.6 | — | Unverified |
| 2 | Macro | Precision | 97.6 | — | Unverified |
| 3 | Force-Copy | Precision | 95.4 | — | Unverified |
| 4 | Hierarchical Transformer Encoder + conditional copy | Precision | 89.46 | — | Unverified |
| 5 | Neural Content Planning + conditional copy | Precision | 87.47 | — | Unverified |
| 6 | Encoder-decoder + conditional copy | Precision | 74.8 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | T5-3B | BLEU | 49.5 | — | Unverified |
| 2 | LATTICE (T5-base) | BLEU | 48.4 | — | Unverified |
| 3 | BERT-to-BERT | BLEU | 44 | — | Unverified |
| 4 | Pointer Generator | BLEU | 41.6 | — | Unverified |
| 5 | NCP+CC (Puduppully et al., 2019) | BLEU | 19.2 | — | Unverified |
| 6 | T5 | METEOR | 0.36 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Fact-aware embedding with mT5 | BLEU4 | 29.27 | — | Unverified |
| 2 | Bi-lingual mT5 | BLEU4 | 25.88 | — | Unverified |
| 3 | mT5 | BLEU4 | 25 | — | Unverified |
| 4 | Vanilla Transformer | BLEU4 | 19.9 | — | Unverified |
| 5 | Translate-Output mT5 | BLEU4 | 18.91 | — | Unverified |
| 6 | Graph Attention Network Encoder +Transformer Decoder | BLEU4 | 18.3 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | T5B Baseline | BLEU | 48.47 | — | Unverified |
| 2 | FactT5B | BLEU | 48.37 | — | Unverified |
| 3 | self-mem + new data | BLEU | 47.76 | — | Unverified |
| 4 | JointGT Baseline | BLEU | 47.51 | — | Unverified |
| 5 | FactJointGT | BLEU | 47.39 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Hierarchical Transformer Encoder + conditional copy | DLD | 18.9 | — | Unverified |
| 2 | Neural Content Planning + conditional copy | DLD | 18.58 | — | Unverified |
| 3 | Macro | DLD | 17.7 | — | Unverified |
| 4 | Force-Copy | DLD | 17.26 | — | Unverified |
| 5 | Encoder-decoder + conditional copy | DLD | 8.68 | — | Unverified |
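The DLD columns above are commonly reported as a normalized Damerau–Levenshtein score between the sequence of records a model mentions and the gold sequence (higher is better). A from-scratch sketch, assuming the restricted variant that allows only adjacent transpositions, and treating "similarity" as one minus the distance over the longer length:

```python
def dld(a, b):
    """Restricted Damerau-Levenshtein distance (adjacent transpositions)."""
    d = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i in range(len(a) + 1):
        d[i][0] = i
    for j in range(len(b) + 1):
        d[0][j] = j
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution
            if i > 1 and j > 1 and a[i - 1] == b[j - 2] and a[i - 2] == b[j - 1]:
                d[i][j] = min(d[i][j], d[i - 2][j - 2] + 1)  # transposition
    return d[len(a)][len(b)]

def normalized_dld_similarity(pred, gold):
    """1 - DLD/max_len, as a percentage; 100 means identical ordering."""
    if not pred and not gold:
        return 100.0
    return 100.0 * (1 - dld(pred, gold) / max(len(pred), len(gold)))
```

For example, swapping two adjacent records in a three-record plan costs one transposition, so the similarity drops to about 66.7.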

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Hierarchical Transformer Encoder + conditional copy | Precision | 39.47 | — | Unverified |
| 2 | Force-Copy | Precision | 34.34 | — | Unverified |
| 3 | Neural Content Planning + conditional copy | Precision | 34.18 | — | Unverified |
| 4 | Macro | Precision | 34.1 | — | Unverified |
| 5 | Encoder-decoder + conditional copy | Precision | 29.49 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | SeqPlan | BLEU | 14.29 | — | Unverified |
| 2 | Macro | BLEU | 12.62 | — | Unverified |
| 3 | ENT | BLEU | 11.5 | — | Unverified |
| 4 | Force-Copy | BLEU | 10.5 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | SeqPlan | DLD | 22.7 | — | Unverified |
| 2 | Macro | DLD | 21.8 | — | Unverified |
| 3 | Force-Copy | DLD | 21.16 | — | Unverified |
| 4 | ENT | DLD | 20.7 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | SeqPlan | Precision | 95.9 | — | Unverified |
| 2 | Macro | Precision | 94.4 | — | Unverified |
| 3 | Force-Copy | Precision | 84.5 | — | Unverified |
| 4 | ENT | Precision | 81.1 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Force-Copy | Precision | 49.39 | — | Unverified |
| 2 | SeqPlan | Precision | 43.3 | — | Unverified |
| 3 | Macro | Precision | 40.8 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | self-mem + new data (random) | METEOR | 46.11 | — | Unverified |
| 2 | self-mem + new data (fixed) | METEOR | 46.07 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Transition based Deep Input Linearization | BLEU | 80.49 | — | Unverified |
| 2 | GCN + feat | BLEU | 0.67 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | DataTuner_FC | BLEU | 53.6 | — | Unverified |
| 2 | Bo3 | BLEU | 52.1 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | StructAdapt | BLEU | 48 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | T5-large | BLEU | 45.85 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | T5-large | BLEU | 69.27 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Ours | BLEU | 24.56 | — | Unverified |
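Most rows above report BLEU, which combines clipped n-gram precisions with a brevity penalty. A minimal single-reference, unsmoothed sketch is below; note that the benchmark numbers are typically corpus-level and often multi-reference (e.g. computed with sacreBLEU), so this toy version will not reproduce them exactly:

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(hypothesis, reference, max_n=4):
    """Sentence-level BLEU on a 0-100 scale, one reference, no smoothing."""
    hyp, ref = hypothesis.split(), reference.split()
    log_precisions = []
    for n in range(1, max_n + 1):
        hyp_counts = Counter(ngrams(hyp, n))
        ref_counts = Counter(ngrams(ref, n))
        # Clip each hypothesis n-gram count by its count in the reference.
        clipped = sum(min(c, ref_counts[g]) for g, c in hyp_counts.items())
        total = max(sum(hyp_counts.values()), 1)
        if clipped == 0:
            return 0.0  # unsmoothed: any zero precision zeroes the score
        log_precisions.append(math.log(clipped / total))
    # Brevity penalty discourages overly short hypotheses.
    bp = 1.0 if len(hyp) > len(ref) else math.exp(1 - len(ref) / max(len(hyp), 1))
    return 100 * bp * math.exp(sum(log_precisions) / max_n)
```

A hypothesis identical to its reference scores 100; one sharing no unigrams with it scores 0.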