SOTAVerified

Text Generation

Text generation is the task of producing text that is indistinguishable from human-written text. In the literature, this task is more formally known as natural language generation (NLG).

Text generation can be addressed with Markov processes or deep generative models such as LSTMs. More recently, some of the most advanced methods have been Transformer-based models such as BART and GPT, along with GAN-based approaches. Text generation systems are evaluated either through human ratings or with automatic metrics such as BLEU, METEOR, and ROUGE.
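The Markov-process approach mentioned above can be sketched in a few lines: build a table mapping each n-word prefix to the words that follow it in a training text, then sample a walk through that table. This is a minimal illustration, not any specific system's implementation; the function names (`build_markov_model`, `generate`) and the toy corpus are my own.

```python
import random
from collections import defaultdict

def build_markov_model(text, order=2):
    """Map each `order`-word prefix to the list of words seen after it."""
    words = text.split()
    model = defaultdict(list)
    for i in range(len(words) - order):
        model[tuple(words[i:i + order])].append(words[i + order])
    return model

def generate(model, order=2, length=20, seed=0):
    """Walk the prefix table, sampling one continuation word at a time."""
    rng = random.Random(seed)
    output = list(rng.choice(list(model.keys())))
    while len(output) < length:
        choices = model.get(tuple(output[-order:]))
        if not choices:  # dead end: this prefix never continued in training text
            break
        output.append(rng.choice(choices))
    return " ".join(output)

corpus = "the cat sat on the mat and the dog sat on the rug"
model = build_markov_model(corpus, order=2)
print(generate(model, order=2, length=8, seed=42))
```

Deep generative models replace the lookup table with a learned conditional distribution over the next token, but the sampling loop has the same shape.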

(Image credit: Adversarial Ranking for Language Generation)

Papers

Showing 1–10 of 5,335 papers

| Title | Status | Hype |
| --- | --- | --- |
| Making Language Model a Hierarchical Classifier and Generator | Code | 0 |
| Mitigating Object Hallucinations via Sentence-Level Early Intervention | Code | 1 |
| Seq vs Seq: An Open Suite of Paired Encoders and Decoders | Code | 2 |
| The Devil behind the mask: An emergent safety vulnerability of Diffusion LLMs | Code | 2 |
| Hashed Watermark as a Filter: Defeating Forging and Overwriting Attacks in Weight-based Neural Network Watermarking | Code | 0 |
| Exploiting Leaderboards for Large-Scale Distribution of Malicious Models | | 0 |
| FIFA: Unified Faithfulness Evaluation Framework for Text-to-Video and Video-to-Text Generation | | 0 |
| CLI-RAG: A Retrieval-Augmented Framework for Clinically Structured and Context Aware Text Generation with LLMs | | 0 |
| Advancing Offline Handwritten Text Recognition: A Systematic Review of Data Augmentation and Generation Techniques | | 0 |
| OpenFActScore: Open-Source Atomic Evaluation of Factuality in Text Generation | Code | 0 |
Page 1 of 534

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | Beam search + A*esque (beam) | BLEU-1 | 34.4 | | Unverified |
| 2 | Beam search + A*esque (sample) | BLEU-1 | 34.4 | | Unverified |
| 3 | Beam search + A*esque (greedy) | BLEU-1 | 34.3 | | Unverified |
| 4 | Beam search | BLEU-1 | 33.7 | | Unverified |
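The BLEU-1 scores in the table above are unigram BLEU: clipped unigram precision against a reference, multiplied by a brevity penalty. A minimal sentence-level sketch follows; the function name `bleu1` and the example sentences are mine, and note that leaderboard numbers are typically corpus-level scores (often with smoothing), so this toy version will not reproduce them exactly.

```python
import math
from collections import Counter

def bleu1(reference, hypothesis):
    """Sentence-level BLEU-1: clipped unigram precision times brevity penalty."""
    ref, hyp = reference.split(), hypothesis.split()
    ref_counts, hyp_counts = Counter(ref), Counter(hyp)
    # Clip each hypothesis word's count at its count in the reference.
    clipped = sum(min(c, ref_counts[w]) for w, c in hyp_counts.items())
    precision = clipped / len(hyp)
    # Penalize hypotheses shorter than the reference.
    bp = 1.0 if len(hyp) >= len(ref) else math.exp(1 - len(ref) / len(hyp))
    return bp * precision

ref = "the cat sat on the mat"
hyp = "the cat sat on a mat"
print(round(bleu1(ref, hyp) * 100, 1))  # → 83.3 (5 of 6 unigrams match, no length penalty)
```

Higher-order BLEU extends this to bigram, trigram, and 4-gram precisions combined as a geometric mean.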