SOTAVerified

Text Infilling

Text Infilling is the task of predicting missing spans of text that are consistent with the preceding and subsequent text. Text Infilling is a generalization of the cloze task: cloze historically refers to infilling individual words.

Source: Enabling Language Models to Fill in the Blanks
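The span-infilling setup described above can be illustrated with a small sketch. It follows the general scheme of the cited ILM paper, in which a span is replaced by a blank marker and the model is trained to emit the answer after a separator; the helper functions and the exact token strings (`[blank]`, `[sep]`, `[answer]`) are illustrative assumptions here, not the authors' code.

```python
# Sketch of span-infilling data formatting, loosely following the scheme in
# "Enabling Language Models to Fill in the Blanks". The helpers and token
# strings are hypothetical, for illustration only.

def make_infilling_example(text: str, start: int, end: int) -> tuple[str, str]:
    """Replace text[start:end] with a [blank] marker and return
    (masked input ending in [sep], infilling target)."""
    masked = text[:start] + "[blank]" + text[end:]
    target = text[start:end] + " [answer]"
    return masked + " [sep]", target

def fill_blank(masked_with_sep: str, prediction: str) -> str:
    """Splice a predicted span back into the masked input."""
    masked = masked_with_sep.removesuffix(" [sep]")
    return masked.replace("[blank]", prediction, 1)

text = "She ate cereal for breakfast."
inp, tgt = make_infilling_example(text, 8, 14)
# inp == "She ate [blank] for breakfast. [sep]"
# tgt == "cereal [answer]"
assert fill_blank(inp, "cereal") == text
```

Because the answer is appended after the separator rather than predicted in place, a standard left-to-right language model can be trained on such pairs without any architectural changes, which is the key point of the infilling formulation.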

Papers

Showing 11-20 of 43 papers

Title | Status | Hype
LOT: A Story-Centric Benchmark for Evaluating Chinese Long Text Understanding and Generation | Code | 1
Improving Sequence-to-Sequence Pre-training via Sequence Span Rewriting | Code | 1
Back to the Future: Unsupervised Backprop-based Decoding for Counterfactual and Abductive Commonsense Reasoning | Code | 1
Enabling Language Models to Fill in the Blanks | Code | 1
Flexible-length Text Infilling for Discrete Diffusion Models | | 0
Insertion Language Models: Sequence Generation with Arbitrary-Position Insertions | | 0
Enhancing Spoken Discourse Modeling in Language Models Using Gestural Cues | | 0
TrajGPT: Controlled Synthetic Trajectory Generation Using a Multitask Transformer-Based Spatiotemporal Model | Code | 0
Empowering Character-level Text Infilling by Eliminating Sub-Tokens | Code | 0
Towards Probabilistically-Sound Beam Search with Masked Language Models | Code | 0
Page 2 of 5

No leaderboard results yet.