Abstractive Text Summarization
Abstractive text summarization is the task of generating a short, concise summary that captures the salient ideas of a source text. Unlike extractive summarization, which copies sentences verbatim, the generated summary may contain new phrases and sentences that do not appear in the source.
Source: Generative Adversarial Network for Abstractive Text Summarization
Image credit: Abstractive Text Summarization using Sequence-to-sequence RNNs and Beyond
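Most results below are reported as ROUGE-1, the unigram-overlap F1 between a generated summary and a reference. As a minimal, illustrative sketch (whitespace tokenization only, no stemming or stopword handling, unlike full ROUGE implementations):

```python
from collections import Counter

def rouge_1(reference: str, candidate: str) -> float:
    """ROUGE-1 F1: unigram overlap between reference and candidate summaries."""
    ref = Counter(reference.lower().split())
    cand = Counter(candidate.lower().split())
    # Counter intersection keeps the minimum count of each shared unigram.
    overlap = sum((ref & cand).values())
    if overlap == 0:
        return 0.0
    recall = overlap / sum(ref.values())
    precision = overlap / sum(cand.values())
    return 2 * precision * recall / (precision + recall)
```

Scores in the tables are this quantity scaled to 0–100 (so 47.78 means an F1 of 0.4778); a couple of entries report the raw 0–1 value instead.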
Papers
846 papers address this task.
Benchmark datasets: CNN / Daily Mail, Abstractive Text Summarization from Il Post, vietnews, Abstractive Text Summarization from Fanpage, EDUsum, MLSum-it, WITS, AESLC, eLife, Inshorts News, MLSUM de
Benchmark Results
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Scrambled code + broken (alter) | ROUGE-1 | 48.18 | — | Unverified |
| 2 | BRIO | ROUGE-1 | 47.78 | — | Unverified |
| 3 | Pegasus | ROUGE-1 | 47.36 | — | Unverified |
| 4 | PEGASUS + SummaReranker | ROUGE-1 | 47.16 | — | Unverified |
| 5 | Scrambled code + broken | ROUGE-1 | 46.71 | — | Unverified |
| 6 | BART + SimCLS | ROUGE-1 | 46.67 | — | Unverified |
| 7 | SEASON | ROUGE-1 | 46.27 | — | Unverified |
| 8 | Fourier Transformer | ROUGE-1 | 44.76 | — | Unverified |
| 9 | GLM-XXLarge | ROUGE-1 | 44.7 | — | Unverified |
| 10 | BART + R-Drop | ROUGE-1 | 44.51 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | mBART | ROUGE-1 | 38.91 | — | Unverified |
| 2 | BART-IT | ROUGE-1 | 37.31 | — | Unverified |
| 3 | mT5 | ROUGE-1 | 35.04 | — | Unverified |
| 4 | IT5 | ROUGE-1 | 33.78 | — | Unverified |
| 5 | IT5-base | ROUGE-1 | 32.88 | — | Unverified |
| 6 | Pegasus-CNN/DM (eng-it translation) | ROUGE-1 | 23.96 | — | Unverified |
| 7 | Pegasus-XSum (eng-it translation) | ROUGE-1 | 21.03 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Kết quả nghiên cứu | ROUGE-1 | 67.8 | — | Unverified |
| 2 | ViT5 large | ROUGE-1 | 63.37 | — | Unverified |
| 3 | ViT5 base | ROUGE-1 | 62.77 | — | Unverified |
| 4 | BARTpho | ROUGE-1 | 61.14 | — | Unverified |
| 5 | mBART | ROUGE-1 | 59.81 | — | Unverified |
| 6 | mT5 | ROUGE-1 | 58.05 | — | Unverified |
| 7 | Transformer | ROUGE-1 | 57.56 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | mBART | ROUGE-1 | 36.52 | — | Unverified |
| 2 | mBART | ROUGE-1 | 36.5 | — | Unverified |
| 3 | BART-IT | ROUGE-1 | 35.42 | — | Unverified |
| 4 | mT5 | ROUGE-1 | 34.13 | — | Unverified |
| 5 | IT5-base | ROUGE-1 | 33.99 | — | Unverified |
| 6 | IT5 | ROUGE-1 | 33.83 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | GP_Step_Sim | ROUGE-1 | 64.48 | — | Unverified |
| 2 | NEZHA | ROUGE-1 | 63.91 | — | Unverified |
| 3 | RoBERTa | ROUGE-1 | 63.22 | — | Unverified |
| 4 | BERT | ROUGE-1 | 62.37 | — | Unverified |
| 5 | Seq2seq | ROUGE-1 | 48.62 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | BART (TextBox 2.0) | ROUGE-1 | 44.47 | — | Unverified |
| 2 | SEGMENT | ROUGE-1 | 42.1 | — | Unverified |
| 3 | CriSPO 3-shot | ROUGE-1 | 42.1 | — | Unverified |
| 4 | Claude Instant + SigExt | ROUGE-1 | 42 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | mBART | ROUGE-1 | 19.35 | — | Unverified |
| 2 | IT5 | ROUGE-1 | 19.29 | — | Unverified |
| 3 | Pegasus-CNN/DM (eng-it translation) | ROUGE-1 | 16.97 | — | Unverified |
| 4 | Pegasus-XSum (eng-it translation) | ROUGE-1 | 15.17 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | BART-IT | ROUGE-1 | 42.32 | — | Unverified |
| 2 | mT5 | ROUGE-1 | 40.6 | — | Unverified |
| 3 | mBART | ROUGE-1 | 39.32 | — | Unverified |
| 4 | IT5-base | ROUGE-1 | 37.98 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | PEGASUS | ROUGE-1 | 37.68 | — | Unverified |
| 2 | Multi-Stage Extractor/Abstractor | ROUGE-1 | 23.67 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Two stage LLMs | Test ROUGE-1 | 0.46 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | T2SAM | ROUGE | 48.5 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | mBART | METEOR | 0.44 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | mBART | METEOR | 0.21 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Two stage LLMs | Test ROUGE-1 | 0.44 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | BertSum | Content F1 | 29.8 | — | Unverified |