
Text Infilling

Text Infilling is the task of predicting missing spans of text that are consistent with the preceding and subsequent text. It is a generalization of the cloze task, which historically refers to infilling individual words.

Source: Enabling Language Models to Fill in the Blanks
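The setup above can be sketched in a few lines of Python: a span is removed from the text and replaced with a placeholder, and the model's job is to predict the span so it fits both sides of the context. The `[blank]` marker and the helper names below are illustrative assumptions, not a fixed standard from the source paper.

```python
def make_infilling_example(text: str, start: int, end: int):
    """Remove text[start:end] and return (masked_text, target_span).

    The masked text is what the model sees; the target span is what
    it must predict. A cloze example is the special case where the
    span covers a single word.
    """
    masked = text[:start] + "[blank]" + text[end:]
    target = text[start:end]
    return masked, target


def fill_blank(masked: str, prediction: str) -> str:
    """Splice a predicted span back into the masked text."""
    return masked.replace("[blank]", prediction, 1)


example = "She ate leftover pasta for lunch."
masked, target = make_infilling_example(example, 8, 22)
# masked == "She ate [blank] for lunch.", target == "leftover pasta"
```

A valid infill is any span that reads naturally given both the left context ("She ate") and the right context ("for lunch."), which is what distinguishes infilling from left-to-right continuation.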

Papers

Showing 1–10 of 43 papers

Title | Status | Hype
LaViDa: A Large Diffusion Language Model for Multimodal Understanding | Code | 3
The Devil behind the mask: An emergent safety vulnerability of Diffusion LLMs | Code | 2
Having Beer after Prayer? Measuring Cultural Bias in Large Language Models | Code | 1
Back to the Future: Unsupervised Backprop-based Decoding for Counterfactual and Abductive Commonsense Reasoning | Code | 1
A Simple yet Effective Framework for Few-Shot Aspect-Based Sentiment Analysis | Code | 1
Generative Prompt Tuning for Relation Classification | Code | 1
Improving Sequence-to-Sequence Pre-training via Sequence Span Rewriting | Code | 1
Enabling Language Models to Fill in the Blanks | Code | 1
CTRLEval: An Unsupervised Reference-Free Metric for Evaluating Controlled Text Generation | Code | 1
Language modeling via stochastic processes | Code | 1
Page 1 of 5

No leaderboard results yet.