SOTAVerified

Pretrained Multilingual Language Models

Papers

Showing 1–10 of 26 papers

Title | Status | Hype
Improving Word Translation via Two-Stage Contrastive Learning | Code | 1
How Good is Your Tokenizer? On the Monolingual Performance of Multilingual Language Models | Code | 1
Small Data? No Problem! Exploring the Viability of Pretrained Multilingual Language Models for Low-resourced Languages | Code | 1
Improving Bilingual Lexicon Induction with Cross-Encoder Reranking | Code | 1
Probing Cross-Lingual Lexical Knowledge from Multilingual Sentence Encoders | | 0
A Primer on Pretrained Multilingual Language Models | | 0
Empirical study of pretrained multilingual language models for zero-shot cross-lingual knowledge transfer in generation | | 0
Improving Non-autoregressive Translation Quality with Pretrained Language Model, Embedding Distillation and Upsampling Strategy for CTC | | 0
Are Pretrained Multilingual Models Equally Fair Across Languages? | | 0
Page 1 of 3

Leaderboard

No leaderboard results yet.