SOTAVerified

Text Infilling

Text Infilling is the task of predicting missing spans of text that are consistent with the preceding and subsequent text. Text Infilling is a generalization of the cloze task: cloze historically refers to infilling individual words.

Source: Enabling Language Models to Fill in the Blanks
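As a minimal illustration (not part of the source page), the input/output format used in span-infilling setups like the one in "Enabling Language Models to Fill in the Blanks" can be sketched with two toy helpers: one that replaces character spans with a `[blank]` token and collects the removed answers, and one that reconstructs the full text from the blanked input plus predicted answers. The function names and the `[blank]` token string here are illustrative assumptions, not the paper's exact interface.

```python
def blank_spans(text, spans, blank="[blank]"):
    """Replace each (start, end) character span with a blank token.

    Returns the blanked text and the list of removed spans (the
    'answers' an infilling model would be trained to generate).
    """
    pieces, answers, prev = [], [], 0
    for start, end in sorted(spans):
        pieces.append(text[prev:start])
        pieces.append(blank)
        answers.append(text[start:end])
        prev = end
    pieces.append(text[prev:])
    return "".join(pieces), answers


def fill_blanks(blanked, answers, blank="[blank]"):
    """Reconstruct the full text by substituting answers for blanks in order."""
    for ans in answers:
        blanked = blanked.replace(blank, ans, 1)
    return blanked


text = "She ate cereal for breakfast."
blanked, answers = blank_spans(text, [(8, 14), (19, 28)])
# blanked -> "She ate [blank] for [blank]."
# answers -> ["cereal", "breakfast"]
assert fill_blanks(blanked, answers) == text
```

Unlike single-word cloze, each blank here may cover a multi-token span, which is exactly the generalization the task definition above describes.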

Papers

Showing 43 of 43 papers

| Title | Status | Hype |
| --- | --- | --- |
| LaViDa: A Large Diffusion Language Model for Multimodal Understanding | Code | 3 |
| The Devil behind the mask: An emergent safety vulnerability of Diffusion LLMs | Code | 2 |
| Back to the Future: Unsupervised Backprop-based Decoding for Counterfactual and Abductive Commonsense Reasoning | Code | 1 |
| Generative Prompt Tuning for Relation Classification | Code | 1 |
| Having Beer after Prayer? Measuring Cultural Bias in Large Language Models | Code | 1 |
| Improving Sequence-to-Sequence Pre-training via Sequence Span Rewriting | Code | 1 |
| Reprogramming Pretrained Language Models for Antibody Sequence Infilling | Code | 1 |
| Prompting ELECTRA: Few-Shot Learning with Discriminative Pre-Trained Models | Code | 1 |
| Language modeling via stochastic processes | Code | 1 |
| LOT: A Story-Centric Benchmark for Evaluating Chinese Long Text Understanding and Generation | Code | 1 |
| MAGVLT: Masked Generative Vision-and-Language Transformer | Code | 1 |
| A Simple yet Effective Framework for Few-Shot Aspect-Based Sentiment Analysis | Code | 1 |
| Enabling Language Models to Fill in the Blanks | Code | 1 |
| CTRLEval: An Unsupervised Reference-Free Metric for Evaluating Controlled Text Generation | Code | 1 |
| On the Role of Bidirectionality in Language Model Pre-Training | | 0 |
| A Benchmark for Text Expansion: Datasets, Metrics, and Baselines | | 0 |
| A-TIP: Attribute-aware Text Infilling via Pre-trained Language Model | | 0 |
| BiTIIMT: A Bilingual Text-infilling Method for Interactive Machine Translation | | 0 |
| Building a Knowledge-Based Dialogue System with Text Infilling | | 0 |
| Coordination Generation via Synchronized Text-Infilling | | 0 |
| Decoding As Dynamic Programming For Recurrent Autoregressive Models | | 0 |
| Don't Prompt, Search! Mining-based Zero-Shot Learning with Language Models | | 0 |
| Enhancing Spoken Discourse Modeling in Language Models Using Gestural Cues | | 0 |
| Flexible-length Text Infilling for Discrete Diffusion Models | | 0 |
| Generative Prompt Tuning for Relation Classification | | 0 |
| InFillmore: Frame-Guided Language Generation with Bidirectional Context | | 0 |
| Insertion Language Models: Sequence Generation with Arbitrary-Position Insertions | | 0 |
| "Mask and Infill": Applying Masked Language Model to Sentiment Transfer | | 0 |
| Nutri-bullets Hybrid: Consensual Multi-document Summarization | | 0 |
| Predicting scalar diversity with context-driven uncertainty over alternatives | | 0 |
| Reflective Decoding: Beyond Unidirectional Generation with Off-the-Shelf Language Models | | 0 |
| Sequence-to-Sequence Pre-training with Unified Modality Masking for Visual Document Understanding | | 0 |
| TIGS: An Inference Algorithm for Text Infilling with Gradient Search | Code | 0 |
| Text Infilling | Code | 0 |
| Towards Probabilistically-Sound Beam Search with Masked Language Models | Code | 0 |
| Keep Calm and Switch On! Preserving Sentiment and Fluency in Semantic Text Exchange | Code | 0 |
| Conformal prediction for text infilling and part-of-speech prediction | Code | 0 |
| Empowering Character-level Text Infilling by Eliminating Sub-Tokens | Code | 0 |
| TrajGPT: Controlled Synthetic Trajectory Generation Using a Multitask Transformer-Based Spatiotemporal Model | Code | 0 |
| MetaFill: Text Infilling for Meta-Path Generation on Heterogeneous Information Networks | Code | 0 |
| Model-tuning Via Prompts Makes NLP Models Adversarially Robust | Code | 0 |
| Show Me How To Revise: Improving Lexically Constrained Sentence Generation with XLNet | Code | 0 |
| Nutribullets Hybrid: Multi-document Health Summarization | Code | 0 |

No leaderboard results yet.