# Document Summarization
Automatic document summarization is the task of rewriting a document into a shorter form while retaining its important content. The two most popular paradigms are extractive and abstractive approaches. Extractive approaches build a summary by selecting parts of the original document (usually whole sentences), while abstractive methods may generate new words or phrases that do not appear in the original document.
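As an illustration of the extractive paradigm, here is a minimal frequency-based sentence scorer. It is a sketch, not any published system: the function name, stopword list, and scoring rule are all illustrative.

```python
import re
from collections import Counter

# A tiny illustrative stopword list; real systems use much larger ones.
STOPWORDS = {"the", "a", "an", "of", "to", "in", "and", "is", "that", "it",
             "for", "on", "with", "as", "are", "was", "by", "this", "be"}

def extractive_summary(document: str, n_sentences: int = 2) -> str:
    """Score each sentence by the average corpus frequency of its content
    words, then return the top-scoring sentences in original order."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", document) if s.strip()]
    words = [w for w in re.findall(r"[a-z']+", document.lower()) if w not in STOPWORDS]
    freq = Counter(words)

    def score(sentence: str) -> float:
        tokens = [w for w in re.findall(r"[a-z']+", sentence.lower())
                  if w not in STOPWORDS]
        return sum(freq[t] for t in tokens) / (len(tokens) or 1)

    # Rank sentence indices by score, keep the best n, restore document order.
    ranked = sorted(range(len(sentences)), key=lambda i: score(sentences[i]), reverse=True)
    chosen = sorted(ranked[:n_sentences])
    return " ".join(sentences[i] for i in chosen)
```

Because the output is stitched together from sentences of the input, it is guaranteed to be grammatical at the sentence level; abstractive models trade that guarantee for the ability to paraphrase and compress.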
## Papers

760 papers report results on this task; the top entries per leaderboard are listed below.
Datasets: CNN / Daily Mail, HowSumm-Step, HowSumm-Method, Arxiv HEP-TH citation graph, arXiv Summarization Dataset, BBC XSum, WikiLingua (tr->en)
## Benchmark Results

### CNN / Daily Mail

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | HAT-BART | ROUGE-1 | 44.48 | — | Unverified |
| 2 | MatchSum (RoBERTa-base) | ROUGE-1 | 44.41 | — | Unverified |
| 3 | Hie-BART | ROUGE-1 | 44.35 | — | Unverified |
| 4 | MatchSum (BERT-base) | ROUGE-1 | 44.22 | — | Unverified |
| 5 | BertSumExt | ROUGE-1 | 43.85 | — | Unverified |
| 6 | BigBird-Pegasus | ROUGE-1 | 43.84 | — | Unverified |
| 7 | T5-11B | ROUGE-1 | 43.52 | — | Unverified |
| 8 | BERTSUM+Transformer | ROUGE-1 | 43.25 | — | Unverified |
| 9 | UniLM (Abstractive Summarization) | ROUGE-1 | 43.08 | — | Unverified |
| 10 | Selector+Pointer Generator | ROUGE-1 | 41.72 | — | Unverified |
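Every leaderboard here reports a ROUGE score; ROUGE-1 is the unigram-overlap F-measure between a generated summary and a reference. A minimal sketch of the computation follows (whitespace tokenization only; published numbers come from the official ROUGE toolkit, which also applies stemming and other preprocessing):

```python
from collections import Counter

def rouge_1(candidate: str, reference: str) -> float:
    """ROUGE-1 F1: unigram overlap between candidate and reference,
    counted with multiplicity (clipped counts)."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum(min(cand[w], ref[w]) for w in cand)
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)
```

ROUGE-2 is the same measure over bigrams, and ROUGE-L uses the longest common subsequence instead of n-gram counts; all three reward surface overlap rather than semantic equivalence, which is a known limitation of the metric.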

### HowSumm-Step

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | LexRank (query: step title) | ROUGE-1 | 39.6 | — | Unverified |
| 2 | CES (query: step title) | ROUGE-1 | 39.3 | — | Unverified |
| 3 | CES (query: step + method titles) | ROUGE-1 | 38.3 | — | Unverified |
| 4 | LexRank (query: step + method titles) | ROUGE-1 | 38.2 | — | Unverified |
| 5 | CES (query: step + method + article titles) | ROUGE-1 | 37.0 | — | Unverified |
| 6 | LexRank (query: step + method + article titles) | ROUGE-1 | 36.3 | — | Unverified |
| 7 | GreedyRel (query: step + method titles) | ROUGE-1 | 30.3 | — | Unverified |
| 8 | GreedyRel (query: step title) | ROUGE-1 | 30.1 | — | Unverified |
| 9 | BM25-HierSumm (query: step + method titles) | ROUGE-1 | 23.0 | — | Unverified |
| 10 | BM25-HierSumm (query: step title) | ROUGE-1 | 22.3 | — | Unverified |
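LexRank, the strongest baseline in the table above, scores sentences by their eigenvector centrality in a sentence-similarity graph. The following is a much-simplified sketch of that idea: it uses plain term-frequency cosine similarity and no similarity threshold, whereas the original formulation uses TF-IDF vectors and thresholded edges, and the function name and parameters are illustrative.

```python
import math
import re
from collections import Counter

def lexrank(sentences: list[str], damping: float = 0.85, iters: int = 50) -> list[float]:
    """Simplified continuous LexRank: rank sentences by centrality in a
    cosine-similarity graph, computed by damped power iteration."""
    vecs = [Counter(re.findall(r"[a-z']+", s.lower())) for s in sentences]

    def cosine(a: Counter, b: Counter) -> float:
        dot = sum(a[w] * b[w] for w in a)
        norm = (math.sqrt(sum(v * v for v in a.values()))
                * math.sqrt(sum(v * v for v in b.values())))
        return dot / norm if norm else 0.0

    n = len(sentences)
    sim = [[cosine(vecs[i], vecs[j]) for j in range(n)] for i in range(n)]
    # Row-normalize the similarity matrix into a stochastic matrix,
    # then power-iterate with uniform teleportation (as in PageRank).
    row_sums = [sum(row) or 1.0 for row in sim]
    scores = [1.0 / n] * n
    for _ in range(iters):
        scores = [
            (1 - damping) / n
            + damping * sum(scores[j] * sim[j][i] / row_sums[j] for j in range(n))
            for i in range(n)
        ]
    return scores
```

A summary is then formed by taking the highest-scoring sentences; in the query-focused setting of the tables above, the query (e.g. the step title) additionally biases which sentences are preferred.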

### HowSumm-Method

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | LexRank (query: method + article + steps titles) | ROUGE-1 | 53.5 | — | Unverified |
| 2 | CES (query: method + article + steps titles) | ROUGE-1 | 52.2 | — | Unverified |
| 3 | GreedyRel (query: method + article + steps titles) | ROUGE-1 | 48.6 | — | Unverified |
| 4 | CES (query: method title) | ROUGE-1 | 48.4 | — | Unverified |
| 5 | CES (query: method + article titles) | ROUGE-1 | 48.3 | — | Unverified |
| 6 | LexRank (query: method title) | ROUGE-1 | 47.7 | — | Unverified |
| 7 | LexRank (query: method + article titles) | ROUGE-1 | 47.1 | — | Unverified |
| 8 | GreedyRel (query: method title) | ROUGE-1 | 43.4 | — | Unverified |
| 9 | GreedyRel (query: method + article titles) | ROUGE-1 | 42.3 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | DeepPyramidion | ROUGE-1 | 47.15 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | DeepPyramidion | ROUGE-2 | 19.99 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | BigBird-Pegasus | ROUGE-1 | 47.12 | — | Unverified |

### WikiLingua (tr->en)

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | DOCmT5 | ROUGE-L | 31.37 | — | Unverified |