SOTAVerified

Text Infilling

Text Infilling is the task of predicting missing spans of text that are consistent with the preceding and subsequent text. It is a generalization of the cloze task, which historically refers to infilling individual words.

Source: Enabling Language Models to Fill in the Blanks
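To make the task concrete, here is a minimal sketch of how a span-infilling training example can be constructed, loosely following the blank-and-answer scheme described in "Enabling Language Models to Fill in the Blanks". The helper name and the exact token strings (`[blank]`, `[sep]`, `[answer]`) are illustrative assumptions, not a definitive reproduction of the paper's vocabulary.

```python
# Illustrative span-infilling format: masked spans become [blank] tokens in
# the input, and the model is expected to generate the missing spans after
# a [sep] token, each terminated by [answer]. Token names are assumptions.

BLANK, SEP, ANSWER = "[blank]", "[sep]", "[answer]"

def make_infilling_example(text, spans):
    """Replace each (start, end) character span of `text` with a blank token
    and return (model_input, expected_continuation)."""
    masked_parts, answers, cursor = [], [], 0
    for start, end in sorted(spans):
        masked_parts.append(text[cursor:start] + BLANK)
        answers.append(text[start:end] + " " + ANSWER)
        cursor = end
    masked_parts.append(text[cursor:])
    model_input = "".join(masked_parts) + " " + SEP
    return model_input, " ".join(answers)

example_input, example_target = make_infilling_example(
    "She ate leftover pasta for breakfast.", [(8, 22)]
)
# example_input  -> "She ate [blank] for breakfast. [sep]"
# example_target -> "leftover pasta [answer]"
```

A language model trained on pairs like these learns to condition on both the preceding and subsequent text when generating the missing span, which is exactly what distinguishes infilling from left-to-right generation.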

Papers

Showing 26–43 of 43 papers

Title | Status | Hype
CTRLEval: An Unsupervised Reference-Free Metric for Evaluating Controlled Text Generation | Code | 1
Language modeling via stochastic processes | Code | 1
Generative Prompt Tuning for Relation Classification | — | 0
Conformal prediction for text infilling and part-of-speech prediction | Code | 0
Show Me How To Revise: Improving Lexically Constrained Sentence Generation with XLNet | Code | 0
LOT: A Story-Centric Benchmark for Evaluating Chinese Long Text Understanding and Generation | Code | 1
Nutri-bullets Hybrid: Consensual Multi-document Summarization | — | 0
Nutribullets Hybrid: Multi-document Health Summarization | Code | 0
InFillmore: Frame-Guided Language Generation with Bidirectional Context | — | 0
Improving Sequence-to-Sequence Pre-training via Sequence Span Rewriting | Code | 1
Reflective Decoding: Beyond Unidirectional Generation with Off-the-Shelf Language Models | — | 0
Back to the Future: Unsupervised Backprop-based Decoding for Counterfactual and Abductive Commonsense Reasoning | Code | 1
Enabling Language Models to Fill in the Blanks | Code | 1
Decoding As Dynamic Programming For Recurrent Autoregressive Models | — | 0
Keep Calm and Switch On! Preserving Sentiment and Fluency in Semantic Text Exchange | Code | 0
"Mask and Infill": Applying Masked Language Model to Sentiment Transfer | — | 0
TIGS: An Inference Algorithm for Text Infilling with Gradient Search | Code | 0
Text Infilling | Code | 0

No leaderboard results yet.