SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that uses the data and models available for a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
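The common zero-shot setting can be illustrated with a toy sketch: a multilingual encoder is assumed to map translations to nearby points in a shared embedding space, so a classifier trained only on the high-resource language also works on the low-resource one. The encoder below is a hypothetical stand-in (synthetic clusters, not a real model such as XLM-R or LASER), intended only to show the train-on-English, evaluate-on-target pattern.

```python
# Toy sketch of zero-shot cross-lingual transfer (synthetic data, not a real encoder).
import numpy as np

rng = np.random.default_rng(0)

def encode(label, lang, n):
    # Stand-in for a multilingual encoder: the class determines the cluster
    # center in the shared space; the language adds only a small shift.
    center = np.array([3.0, 0.0]) if label == 1 else np.array([-3.0, 0.0])
    lang_shift = 0.2 if lang == "es" else 0.0
    return center + lang_shift + rng.normal(0.0, 0.5, size=(n, 2))

# "English" training data: ample labeled examples.
X_en = np.vstack([encode(0, "en", 100), encode(1, "en", 100)])
y_en = np.array([0] * 100 + [1] * 100)

# Nearest-centroid classifier fit on English only.
centroids = np.stack([X_en[y_en == c].mean(axis=0) for c in (0, 1)])

def predict(X):
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return dists.argmin(axis=1)

# Zero-shot evaluation on the target language: no target labels used in training.
X_es = np.vstack([encode(0, "es", 50), encode(1, "es", 50)])
y_es = np.array([0] * 50 + [1] * 50)
acc = (predict(X_es) == y_es).mean()
print(f"zero-shot accuracy on target language: {acc:.2f}")
```

In practice the encoder would be a pretrained multilingual model and the classifier a fine-tuned head, but the transfer logic is the same: supervision comes entirely from the source language.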

Papers

Showing 111–120 of 782 papers

Title | Status | Hype
Unsupervised Cross-lingual Representation Learning at Scale | Code | 1
Cross-Lingual Natural Language Generation via Pre-Training | Code | 1
Choosing Transfer Languages for Cross-Lingual Learning | Code | 1
Beto, Bentz, Becas: The Surprising Cross-Lingual Effectiveness of BERT | Code | 1
Massively Multilingual Sentence Embeddings for Zero-Shot Cross-Lingual Transfer and Beyond | Code | 1
Enhancing Cross-task Transfer of Large Language Models via Activation Steering | — | 0
HanjaBridge: Resolving Semantic Ambiguity in Korean LLMs via Hanja-Augmented Pre-Training | — | 0
Cross-Lingual Transfer of Cultural Knowledge: An Asymmetric Phenomenon | — | 0
LLMs Are Globally Multilingual Yet Locally Monolingual: Exploring Knowledge Transfer via Language and Thought Theory | — | 0
Multilinguality Does not Make Sense: Investigating Factors Behind Zero-Shot Transfer in Sense-Aware Tasks | — | 0
Page 12 of 79

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | PaLM 2 (few-shot) | Accuracy | 94.4 | — | Unverified
2 | mT0-13B | Accuracy | 84.45 | — | Unverified
3 | RoBERTa Large (translate test) | Accuracy | 76.05 | — | Unverified
4 | BLOOMZ | Accuracy | 75.5 | — | Unverified
5 | MAD-X Base | Accuracy | 60.94 | — | Unverified
6 | mGPT | Accuracy | 55.5 | — | Unverified