SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that uses data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
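The core idea can be illustrated with a toy sketch: a classifier is trained only on labeled English examples, but because the inputs live in a shared cross-lingual embedding space, it can label examples in another language zero-shot. Everything below (the word vectors, the words, the nearest-centroid classifier) is a hypothetical illustration; in practice the shared space would come from a multilingual encoder such as mBERT or XLM-R.

```python
# Toy zero-shot cross-lingual transfer via a shared embedding space.
# The vectors are hand-made for illustration: translation equivalents
# are placed near each other, as a multilingual encoder would aim to do.
SHARED_EMBEDDINGS = {
    # English (high-resource: labeled training data exists)
    "good": (0.9, 0.1), "great": (0.8, 0.2),
    "bad": (0.1, 0.9), "awful": (0.2, 0.8),
    # Spanish (treated as low-resource: no labeled data used in training)
    "bueno": (0.85, 0.15), "malo": (0.15, 0.85),
}

def embed(sentence):
    """Average the shared word vectors of known words in a sentence."""
    vecs = [SHARED_EMBEDDINGS[w] for w in sentence.lower().split()
            if w in SHARED_EMBEDDINGS]
    n = len(vecs)
    return (sum(v[0] for v in vecs) / n, sum(v[1] for v in vecs) / n)

def train_centroids(labeled_examples):
    """'Train' a nearest-centroid classifier on English-only labels."""
    sums, counts = {}, {}
    for text, label in labeled_examples:
        v = embed(text)
        s = sums.setdefault(label, [0.0, 0.0])
        s[0] += v[0]
        s[1] += v[1]
        counts[label] = counts.get(label, 0) + 1
    return {lbl: (s[0] / counts[lbl], s[1] / counts[lbl])
            for lbl, s in sums.items()}

def predict(centroids, text):
    """Assign the label whose centroid is nearest in the shared space."""
    v = embed(text)
    return min(centroids,
               key=lambda lbl: (centroids[lbl][0] - v[0]) ** 2
                             + (centroids[lbl][1] - v[1]) ** 2)

# Train on English only...
centroids = train_centroids(
    [("good", "pos"), ("great", "pos"), ("bad", "neg"), ("awful", "neg")]
)
# ...then classify Spanish inputs zero-shot: "bueno" lands near the
# positive centroid because the embedding space is shared across languages.
print(predict(centroids, "bueno"))  # pos
print(predict(centroids, "malo"))   # neg
```

The sketch deliberately hides all the hard parts (learning a genuinely aligned multilingual space is the subject of several papers listed below), but it shows why transfer works at all: supervision attaches labels to regions of the shared space, not to a specific language.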

Papers

Showing 101–110 of 782 papers

| Title | Status | Hype |
|---|---|---|
| Parameter Space Factorization for Zero-Shot Learning across Tasks and Languages | Code | 1 |
| Pre-training Data Quality and Quantity for a Low-Resource Language: New Corpus and BERT Models for Maltese | Code | 1 |
| Code-Mixing on Sesame Street: Dawn of the Adversarial Polyglots | Code | 1 |
| ALIGN-MLM: Word Embedding Alignment is Crucial for Multilingual Pre-training | Code | 1 |
| ColBERT-XM: A Modular Multi-Vector Representation Model for Zero-Shot Multilingual Information Retrieval | Code | 1 |
| Summarising Historical Text in Modern Languages | Code | 1 |
| Learning Compact Metrics for MT | Code | 1 |
| Multilingual Multimodal Pre-training for Zero-Shot Cross-Lingual Transfer of Vision-Language Models | Code | 1 |
| The Geometry of Multilingual Language Model Representations | Code | 1 |
| XTREME-R: Towards More Challenging and Nuanced Multilingual Evaluation | Code | 1 |

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | PaLM 2 (few-shot) | Accuracy | 94.4 | | Unverified |
| 2 | mT0-13B | Accuracy | 84.45 | | Unverified |
| 3 | RoBERTa Large (translate test) | Accuracy | 76.05 | | Unverified |
| 4 | BLOOMZ | Accuracy | 75.5 | | Unverified |
| 5 | MAD-X Base | Accuracy | 60.94 | | Unverified |
| 6 | mGPT | Accuracy | 55.5 | | Unverified |