
Text Infilling

Text Infilling is the task of predicting missing spans of text so that they are consistent with the preceding and subsequent text. It generalizes the cloze task, which historically refers to infilling individual words.

Source: Enabling Language Models to Fill in the Blanks
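The source paper above frames infilling as a sequence-to-sequence problem: spans are replaced with a blank token and the model learns to emit the missing spans in order. As an illustrative sketch only (the token names `[blank]` and `[answer]` follow that paper's convention; the helper function and its signature are hypothetical), building such a training pair might look like:

```python
def make_infilling_example(tokens, spans):
    """Build an infilling training pair in the style of
    "Enabling Language Models to Fill in the Blanks":
    each masked span becomes a [blank] in the input, and the
    target lists the removed spans, each terminated by [answer].

    tokens: list of word tokens.
    spans:  sorted, non-overlapping (start, end) token index pairs.
    """
    masked, answers = [], []
    prev = 0
    for start, end in spans:
        masked.extend(tokens[prev:start])   # keep context before the span
        masked.append("[blank]")            # mask the span itself
        answers.extend(tokens[start:end])   # record the ground-truth fill
        answers.append("[answer]")          # span terminator in the target
        prev = end
    masked.extend(tokens[prev:])            # keep trailing context
    return " ".join(masked), " ".join(answers)

source, target = make_infilling_example(
    "She ate leftover pasta for lunch".split(), [(2, 4)]
)
# source == "She ate [blank] for lunch"
# target == "leftover pasta [answer]"
```

A model trained on such pairs conditions on both the left and right context of each blank, which is what distinguishes infilling from ordinary left-to-right continuation.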

Papers

Showing 11–20 of 43 papers

Title | Status | Hype
Having Beer after Prayer? Measuring Cultural Bias in Large Language Models | Code | 1
Sequence-to-Sequence Pre-training with Unified Modality Masking for Visual Document Understanding | | 0
MAGVLT: Masked Generative Vision-and-Language Transformer | Code | 1
Model-tuning Via Prompts Makes NLP Models Adversarially Robust | Code | 0
Don't Prompt, Search! Mining-based Zero-Shot Learning with Language Models | | 0
Generative Prompt Tuning for Relation Classification | Code | 1
MetaFill: Text Infilling for Meta-Path Generation on Heterogeneous Information Networks | Code | 0
Reprogramming Pretrained Language Models for Antibody Sequence Infilling | Code | 1
A-TIP: Attribute-aware Text Infilling via Pre-trained Language Model | | 0
Coordination Generation via Synchronized Text-Infilling | | 0
Page 2 of 5

No leaderboard results yet.