
Text Infilling

Text Infilling is the task of predicting missing spans of text that are consistent with both the preceding and subsequent text. Text Infilling is a generalization of the cloze task; cloze historically refers to infilling individual words.

Source: Enabling Language Models to Fill in the Blanks
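The task itself is model-agnostic. As a concrete illustration only (not the method of the source paper, which trains a GPT-2-style model on special blank tokens), here is a minimal sketch using a T5-style encoder-decoder, whose span-corruption pre-training maps naturally onto infilling; the checkpoint, example sentence, and decoding settings are illustrative choices.

```python
# Minimal span-infilling sketch with a T5-style model.
# T5 was pre-trained to reconstruct spans masked by sentinel tokens
# such as <extra_id_0>, so it can fill blanks given both left and
# right context. Requires: pip install transformers sentencepiece torch
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")

# <extra_id_0> marks the missing span between the preceding and
# subsequent text.
text = "She ate <extra_id_0> for breakfast and then left for work."
inputs = tokenizer(text, return_tensors="pt")

outputs = model.generate(**inputs, max_new_tokens=10)
# The decoded output repeats the sentinel followed by the predicted
# span, e.g. "<extra_id_0> pancakes <extra_id_1>" (model-dependent).
print(tokenizer.decode(outputs[0], skip_special_tokens=False))
```

Note the contrast with single-token cloze: a masked language model such as BERT fills exactly one token per mask, whereas an infilling model must decide both the content and the length of the missing span.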

Papers

Showing 1–25 of 43 papers

Title | Status | Hype
The Devil behind the mask: An emergent safety vulnerability of Diffusion LLMs | Code | 2
Flexible-length Text Infilling for Discrete Diffusion Models | - | 0
LaViDa: A Large Diffusion Language Model for Multimodal Understanding | Code | 3
Insertion Language Models: Sequence Generation with Arbitrary-Position Insertions | - | 0
Enhancing Spoken Discourse Modeling in Language Models Using Gestural Cues | - | 0
TrajGPT: Controlled Synthetic Trajectory Generation Using a Multitask Transformer-Based Spatiotemporal Model | Code | 0
Empowering Character-level Text Infilling by Eliminating Sub-Tokens | Code | 0
Towards Probabilistically-Sound Beam Search with Masked Language Models | Code | 0
A Benchmark for Text Expansion: Datasets, Metrics, and Baselines | - | 0
A Simple yet Effective Framework for Few-Shot Aspect-Based Sentiment Analysis | Code | 1
Having Beer after Prayer? Measuring Cultural Bias in Large Language Models | Code | 1
Sequence-to-Sequence Pre-training with Unified Modality Masking for Visual Document Understanding | - | 0
MAGVLT: Masked Generative Vision-and-Language Transformer | Code | 1
Model-tuning Via Prompts Makes NLP Models Adversarially Robust | Code | 0
Don't Prompt, Search! Mining-based Zero-Shot Learning with Language Models | - | 0
Generative Prompt Tuning for Relation Classification | Code | 1
MetaFill: Text Infilling for Meta-Path Generation on Heterogeneous Information Networks | Code | 0
Reprogramming Pretrained Language Models for Antibody Sequence Infilling | Code | 1
Coordination Generation via Synchronized Text-Infilling | - | 0
A-TIP: Attribute-aware Text Infilling via Pre-trained Language Model | - | 0
Building a Knowledge-Based Dialogue System with Text Infilling | - | 0
Prompting ELECTRA: Few-Shot Learning with Discriminative Pre-Trained Models | Code | 1
On the Role of Bidirectionality in Language Model Pre-Training | - | 0
BiTIIMT: A Bilingual Text-infilling Method for Interactive Machine Translation | - | 0
Predicting scalar diversity with context-driven uncertainty over alternatives | - | 0

Leaderboards

No leaderboard results yet.