SOTAVerified

Masked Language Modeling

Papers

Showing 126–150 of 475 papers

| Title | Status | Hype |
| --- | --- | --- |
| The Lottery Ticket Hypothesis for Pre-trained BERT Networks | Code | 1 |
| Language-agnostic BERT Sentence Embedding | Code | 1 |
| Pre-training via Paraphrasing | Code | 1 |
| MC-BERT: Efficient Language Pre-Training via a Meta Controller | Code | 1 |
| Massive Choice, Ample Tasks (MaChAmp): A Toolkit for Multi-task Learning in NLP | Code | 1 |
| HERO: Hierarchical Encoder for Video+Language Omni-representation Pre-training | Code | 1 |
| Segatron: Segment-Aware Transformer for Language Modeling and Understanding | Code | 1 |
| Train No Evil: Selective Masking for Task-Guided Pre-Training | Code | 1 |
| TOD-BERT: Pre-trained Natural Language Understanding for Task-Oriented Dialogue | Code | 1 |
| ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators | Code | 1 |
| Talking-Heads Attention | Code | 1 |
| REALM: Retrieval-Augmented Language Model Pre-Training | Code | 1 |
| UNITER: UNiversal Image-TExt Representation Learning | Code | 1 |
| LXMERT: Learning Cross-Modality Encoder Representations from Transformers | Code | 1 |
| Mask-Predict: Parallel Decoding of Conditional Masked Language Models | Code | 1 |
| GeoRecon: Graph-Level Representation Learning for 3D Molecules via Reconstruction-Based Pretraining | | 0 |
| Masked Language Models are Good Heterogeneous Graph Generalizers | Code | 0 |
| Improving Low-Resource Morphological Inflection via Self-Supervised Objectives | | 0 |
| HAD: Hybrid Architecture Distillation Outperforms Teacher in Genomic Sequence Modeling | | 0 |
| Ankh3: Multi-Task Pretraining with Sequence Denoising and Completion Enhances Protein Representations | | 0 |
| ADALog: Adaptive Unsupervised Anomaly detection in Logs with Self-attention Masked Language Model | | 0 |
| CodeSSM: Towards State Space Models for Code Understanding | | 0 |
| In-Context Learning can distort the relationship between sequence likelihoods and biological fitness | | 0 |
| Low-Resource Transliteration for Roman-Urdu and Urdu Using Transformer-Based Models | | 0 |
| Enhancing Domain-Specific Encoder Models with LLM-Generated Data: How to Leverage Ontologies, and How to Do Without Them | | 0 |
Page 6 of 19

No leaderboard results yet.