SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that uses data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
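A minimal toy sketch of the idea, using entirely hypothetical hand-crafted embeddings (not any system from the table below): translation pairs share one vector space, a classifier is trained only on English labels, and is then applied unchanged, zero-shot, to Spanish input.

```python
# Toy zero-shot cross-lingual transfer. The shared embedding table below is
# hand-crafted for illustration; multilingual pre-training (e.g., mBERT,
# XLM-R) is what produces such a shared space in practice.
EMB = {
    "good": (1.0, 0.2), "bueno": (1.0, 0.2),
    "bad": (-1.0, 0.1), "malo": (-1.0, 0.1),
    "great": (0.9, 0.3), "genial": (0.9, 0.3),
    "awful": (-0.9, 0.2), "horrible": (-0.9, 0.2),
}

def featurize(text):
    """Average the embeddings of known words; zeros if none match."""
    vecs = [EMB[w] for w in text.lower().split() if w in EMB]
    if not vecs:
        return (0.0, 0.0)
    n = len(vecs)
    return (sum(v[0] for v in vecs) / n, sum(v[1] for v in vecs) / n)

def train_perceptron(data, epochs=25):
    """Train on (text, label) pairs, label in {+1, -1}. English only."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for text, y in data:
            x = featurize(text)
            if y * (w[0] * x[0] + w[1] * x[1] + b) <= 0:
                w[0] += y * x[0]
                w[1] += y * x[1]
                b += y
    return w, b

def predict(model, text):
    w, b = model
    x = featurize(text)
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else -1

# Supervision exists only in the high-resource language (English) ...
english_train = [("good", 1), ("great", 1), ("bad", -1), ("awful", -1)]
model = train_perceptron(english_train)

# ... yet the classifier transfers to Spanish via the shared space.
print(predict(model, "genial"))    # prints 1  (positive)
print(predict(model, "horrible"))  # prints -1 (negative)
```

The transfer works only because the two languages are embedded in the same space; with separate monolingual embeddings, the English-trained weights would carry no signal for Spanish.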

Papers

Showing 71–80 of 782 papers

| Title | Status | Hype |
| --- | --- | --- |
| Beto, Bentz, Becas: The Surprising Cross-Lingual Effectiveness of BERT | Code | 1 |
| Cross-Lingual Natural Language Generation via Pre-Training | Code | 1 |
| Finding Universal Grammatical Relations in Multilingual BERT | Code | 1 |
| From One to Many: Expanding the Scope of Toxicity Mitigation in Language Models | Code | 1 |
| GreenPLM: Cross-Lingual Transfer of Monolingual Pre-Trained Language Models at Almost No Cost | Code | 1 |
| IndicXNLI: Evaluating Multilingual Inference for Indian Languages | Code | 1 |
| InfoXLM: An Information-Theoretic Framework for Cross-Lingual Language Model Pre-Training | Code | 1 |
| Investigating Cultural Alignment of Large Language Models | Code | 1 |
| BiToD: A Bilingual Multi-Domain Dataset For Task-Oriented Dialogue Modeling | Code | 1 |
| It's All in the Heads: Using Attention Heads as a Baseline for Cross-Lingual Transfer in Commonsense Reasoning | Code | 1 |

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | PaLM 2 (few-shot) | Accuracy | 94.4 | | Unverified |
| 2 | mT0-13B | Accuracy | 84.45 | | Unverified |
| 3 | RoBERTa Large (translate test) | Accuracy | 76.05 | | Unverified |
| 4 | BLOOMZ | Accuracy | 75.5 | | Unverified |
| 5 | MAD-X Base | Accuracy | 60.94 | | Unverified |
| 6 | mGPT | Accuracy | 55.5 | | Unverified |