SOTAVerified

XLM-R

Papers

Showing 211–220 of 221 papers

| Title | Status | Hype |
|---|---|---|
| FinEst BERT and CroSloEngual BERT: less is more in multilingual models | — | 0 |
| English Intermediate-Task Training Improves Zero-Shot Cross-Lingual Transfer Too | — | 0 |
| BERTweet: A pre-trained language model for English Tweets | Code | 1 |
| From Zero to Hero: On the Limitations of Zero-Shot Cross-Lingual Transfer with Multilingual Transformers | — | 0 |
| MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer | Code | 2 |
| Don't Use English Dev: On the Zero-Shot Cross-Lingual Evaluation of Contextual Embeddings | — | 0 |
| Testing pre-trained Transformer models for Lithuanian news clustering | — | 0 |
| XGLUE: A New Benchmark Dataset for Cross-lingual Pre-training, Understanding and Generation | Code | 1 |
| PhoBERT: Pre-trained language models for Vietnamese | Code | 1 |
Page 22 of 23

No leaderboard results yet.