SOTAVerified

Masked Language Modeling

Papers

Showing 401–450 of 475 papers

| Title | Status | Hype |
| --- | --- | --- |
| MeLT: Message-Level Transformer with Masked Document Representations as Pre-Training for Stance Detection | Code | 0 |
| Split-and-Rephrase in a Cross-Lingual Manner: A Complete Pipeline | | 0 |
| Domain-Specific Japanese ELECTRA Model Using a Small Corpus | | 0 |
| Prompt-Learning for Fine-Grained Entity Typing | | 0 |
| Noobs at Semeval-2021 Task 4: Masked Language Modeling for abstract answer prediction | | 0 |
| Fine-Grained Emotion Prediction by Modeling Emotion Definitions | Code | 0 |
| Learning to Sample Replacements for ELECTRA Pre-Training | | 0 |
| Winner Team Mia at TextVQA Challenge 2021: Vision-and-Language Representation Learning with Pre-trained Sequence-to-Sequence Model | | 0 |
| SAS: Self-Augmentation Strategy for Language Model Pre-training | Code | 0 |
| MST: Masked Self-Supervised Transformer for Visual Representation | | 0 |
| Exploring Unsupervised Pretraining Objectives for Machine Translation | Code | 0 |
| BERTnesia: Investigating the capture and forgetting of knowledge in BERT | Code | 0 |
| Bi-Granularity Contrastive Learning for Post-Training in Few-Shot Scene | | 0 |
| Exposing the Implicit Energy Networks behind Masked Language Models via Metropolis–Hastings | | 0 |
| BERT-Defense: A Probabilistic Model Based on BERT to Combat Cognitively Inspired Orthographic Adversarial Attacks | Code | 0 |
| SCRIPT: Self-Critic PreTraining of Transformers | | 0 |
| Target-Aware Data Augmentation for Stance Detection | | 0 |
| MG-BERT: Multi-Graph Augmented BERT for Masked Language Modeling | | 0 |
| From Masked Language Modeling to Translation: Non-English Auxiliary Tasks Improve Zero-shot Spoken Language Understanding | Code | 0 |
| Larger-Scale Transformers for Multilingual Masked Language Modeling | | 0 |
| Understanding Chinese Video and Language via Contrastive Multimodal Pre-Training | | 0 |
| On the Influence of Masking Policies in Intermediate Pre-training | | 0 |
| Masked Language Modeling and the Distributional Hypothesis: Order Word Matters Pre-training for Little | | 0 |
| ReCAM@IITK at SemEval-2021 Task 4: BERT and ALBERT based Ensemble for Abstract Word Prediction | Code | 0 |
| UC2: Universal Cross-lingual Cross-modal Vision-and-Language Pre-training | | 0 |
| Pseudo-Label Guided Unsupervised Domain Adaptation of Contextual Embeddings | | 0 |
| Self-supervised Image-text Pre-training With Mixed Data In Chest X-rays | | 0 |
| Variable Name Recovery in Decompiled Binary Code using Constrained Masked Language Modeling | | 0 |
| A Primer on Contrastive Pretraining in Language Processing: Methods, Lessons Learned and Perspectives | | 0 |
| Bilingual Language Modeling, A transfer learning technique for Roman Urdu | | 0 |
| MSA Transformer | | 0 |
| SJ_AJ@DravidianLangTech-EACL2021: Task-Adaptive Pre-Training of Multilingual BERT models for Offensive Language Identification | Code | 0 |
| MERMAID: Metaphor Generation with Symbolism and Discriminative Decoding | | 0 |
| Universal Sentence Representations Learning with Conditional Masked Language Model | | 0 |
| Universal Sentence Representation Learning with Conditional Masked Language Model | | 0 |
| Profile Prediction: An Alignment-Based Pre-Training Task for Protein Sequence Models | | 0 |
| XHate-999: Analyzing and Detecting Abusive Language Across Domains and Languages | | 0 |
| Self-Supervised Relationship Probing | | 0 |
| Self-Supervised learning with cross-modal transformers for emotion recognition | | 0 |
| A Hierarchical Multi-Modal Encoder for Moment Localization in Video Corpus | | 0 |
| Controlling the Imprint of Passivization and Negation in Contextualized Representations | Code | 0 |
| POSTECH-ETRI's Submission to the WMT2020 APE Shared Task: Automatic Post-Editing with Cross-lingual Language Model | | 0 |
| Effective Decoder Masking for Transformer Based End-to-End Speech Recognition | | 0 |
| DICT-MLM: Improved Multilingual Pre-Training using Bilingual Dictionaries | | 0 |
| ST-BERT: Cross-modal Language Model Pre-training For End-to-end Spoken Language Understanding | | 0 |
| Corruption Is Not All Bad: Incorporating Discourse Structure into Pre-training via Corruption for Essay Scoring | | 0 |
| VECO: Variable Encoder-decoder Pre-training for Cross-lingual Understanding and Generation | | 0 |
| Deep Transformers with Latent Depth | Code | 0 |
| GraphCodeBERT: Pre-training Code Representations with Data Flow | | 0 |
| Learning Visual Representations with Caption Annotations | | 0 |
Page 9 of 10

No leaderboard results yet.