SOTAVerified

Text Infilling

Text Infilling is the task of predicting missing spans of text that are consistent with the preceding and subsequent text. It generalizes the cloze task: cloze historically refers to infilling individual words, while infilling targets spans of arbitrary length.

Source: Enabling Language Models to Fill in the Blanks
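As an illustration (not from the source page), the core of infilling is choosing the span that agrees with both the left and right context of a blank. The sketch below is a toy heuristic that scores candidates by boundary-word compatibility against a tiny hypothetical corpus; real infilling systems replace this scoring function with a trained language model.

```python
def infill(text, candidates, blank="[BLANK]"):
    """Pick the candidate span that best fits both sides of the blank.

    Toy heuristic: a candidate scores a point for each of its boundary
    words that forms a known adjacent-word pair with the text next to
    the blank. A real system would use a language-model score instead.
    """
    left, right = text.split(blank)
    left_words, right_words = left.split(), right.split()

    # Hypothetical miniature corpus used only to collect word pairs.
    corpus = ("she ate some warm toast with butter for breakfast "
              "he drank some hot coffee with milk for breakfast").split()
    bigrams = set(zip(corpus, corpus[1:]))

    def score(cand):
        words = cand.split()
        s = 0
        if left_words and (left_words[-1], words[0]) in bigrams:
            s += 1  # candidate's first word fits the left context
        if right_words and (words[-1], right_words[0]) in bigrams:
            s += 1  # candidate's last word fits the right context
        return s

    best = max(candidates, key=score)
    return text.replace(blank, best)
```

For example, `infill("she ate some [BLANK] for breakfast", ["cold stone", "warm toast"])` selects "warm toast", because "some warm" appears as an adjacent pair in the toy corpus while "some cold" does not.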

Papers

Showing 31–40 of 43 papers

Title | Status | Hype
LOT: A Story-Centric Benchmark for Evaluating Chinese Long Text Understanding and Generation | Code | 1
Nutri-bullets Hybrid: Consensual Multi-document Summarization | - | 0
Nutribullets Hybrid: Multi-document Health Summarization | Code | 0
InFillmore: Frame-Guided Language Generation with Bidirectional Context | - | 0
Improving Sequence-to-Sequence Pre-training via Sequence Span Rewriting | Code | 1
Reflective Decoding: Beyond Unidirectional Generation with Off-the-Shelf Language Models | - | 0
Back to the Future: Unsupervised Backprop-based Decoding for Counterfactual and Abductive Commonsense Reasoning | Code | 1
Enabling Language Models to Fill in the Blanks | Code | 1
Decoding As Dynamic Programming For Recurrent Autoregressive Models | - | 0
Keep Calm and Switch On! Preserving Sentiment and Fluency in Semantic Text Exchange | Code | 0
Page 4 of 5

No leaderboard results yet.