Text Infilling

Text Infilling is the task of predicting missing spans of text that are consistent with both the preceding and the following context. It generalizes the cloze task, which historically refers to filling in individual words rather than spans.

Source: Enabling Language Models to Fill in the Blanks
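To make the span-vs-word distinction concrete, below is a minimal sketch of span infilling using an off-the-shelf T5 model, whose pretraining objective replaces spans with sentinel tokens such as <extra_id_0>. This example is purely illustrative and is not drawn from any of the papers listed below; the model choice (t5-base) and decoding settings are assumptions.

```python
# Minimal span-infilling sketch (an assumption for illustration,
# not a method from the papers listed on this page).
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")

# "<extra_id_0>" marks the missing span; the model must generate text
# consistent with both the preceding and the following context.
text = "The movie was <extra_id_0> and we left the theater smiling."
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=10)
print(tokenizer.decode(outputs[0], skip_special_tokens=False))
```

The decoded output pairs each sentinel with a predicted span, something like "<extra_id_0> fantastic <extra_id_1>"; a single-word cloze is simply the special case where the missing span is one token long.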

Papers

Showing 26–43 of 43 papers

Insertion Language Models: Sequence Generation with Arbitrary-Position Insertions
A Benchmark for Text Expansion: Datasets, Metrics, and Baselines
A-TIP: Attribute-aware Text Infilling via Pre-trained Language Model
BiTIIMT: A Bilingual Text-infilling Method for Interactive Machine Translation
Building a Knowledge-Based Dialogue System with Text Infilling
Coordination Generation via Synchronized Text-Infilling
Decoding As Dynamic Programming For Recurrent Autoregressive Models
Don't Prompt, Search! Mining-based Zero-Shot Learning with Language Models
Enhancing Spoken Discourse Modeling in Language Models Using Gestural Cues
Flexible-length Text Infilling for Discrete Diffusion Models
Generative Prompt Tuning for Relation Classification
InFillmore: Frame-Guided Language Generation with Bidirectional Context
"Mask and Infill": Applying Masked Language Model to Sentiment Transfer
Nutri-bullets Hybrid: Consensual Multi-document Summarization
On the Role of Bidirectionality in Language Model Pre-Training
Predicting scalar diversity with context-driven uncertainty over alternatives
Reflective Decoding: Beyond Unidirectional Generation with Off-the-Shelf Language Models
Sequence-to-Sequence Pre-training with Unified Modality Masking for Visual Document Understanding

No leaderboard results yet.