SOTAVerified

Masked Language Modeling

Papers

Showing 351–375 of 475 papers

| Title | Status | Hype |
| --- | --- | --- |
| Predicting Attention Sparsity in Transformers | | 0 |
| MeLT: Message-Level Transformer with Masked Document Representations as Pre-Training for Stance Detection | Code | 0 |
| SupCL-Seq: Supervised Contrastive Learning for Downstream Optimized Sequence Representations | Code | 1 |
| CPT: A Pre-Trained Unbalanced Transformer for Both Chinese Language Understanding and Generation | Code | 1 |
| Data Efficient Masked Language Modeling for Vision and Language | Code | 1 |
| Frustratingly Simple Pretraining Alternatives to Masked Language Modeling | Code | 1 |
| Split-and-Rephrase in a Cross-Lingual Manner: A Complete Pipeline | | 0 |
| Domain-Specific Japanese ELECTRA Model Using a Small Corpus | | 0 |
| CTAL: Pre-training Cross-modal Transformer for Audio-and-Language Representations | Code | 1 |
| Sentence Bottleneck Autoencoders from Transformer Language Models | Code | 1 |
| MELM: Data Augmentation with Masked Entity Language Modeling for Low-Resource NER | Code | 1 |
| Prompt-Learning for Fine-Grained Entity Typing | | 0 |
| Knowledge Perceived Multi-modal Pretraining in E-commerce | Code | 1 |
| W2v-BERT: Combining Contrastive Learning and Masked Language Modeling for Self-Supervised Speech Pre-Training | Code | 3 |
| Knowledgeable Prompt-tuning: Incorporating Knowledge into Prompt Verbalizer for Text Classification | Code | 1 |
| Noobs at Semeval-2021 Task 4: Masked Language Modeling for abstract answer prediction | | 0 |
| Fine-Grained Emotion Prediction by Modeling Emotion Definitions | Code | 0 |
| Learning to Sample Replacements for ELECTRA Pre-Training | | 0 |
| Winner Team Mia at TextVQA Challenge 2021: Vision-and-Language Representation Learning with Pre-trained Sequence-to-Sequence Model | | 0 |
| SPBERT: An Efficient Pre-training BERT on SPARQL Queries for Question Answering over Knowledge Graphs | Code | 1 |
| SAS: Self-Augmentation Strategy for Language Model Pre-training | Code | 0 |
| Improving Pretrained Cross-Lingual Language Models via Self-Labeled Word Alignment | Code | 1 |
| Exploring Unsupervised Pretraining Objectives for Machine Translation | Code | 0 |
| MST: Masked Self-Supervised Transformer for Visual Representation | | 0 |
| BERTnesia: Investigating the capture and forgetting of knowledge in BERT | Code | 0 |
Page 15 of 19

No leaderboard results yet.