SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that uses data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
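The core idea can be illustrated with a toy sketch (all data and embeddings below are made up for illustration): a classifier is trained only on English examples, then applied unchanged to words in another language that live in the same aligned multilingual embedding space. This is the zero-shot flavor of cross-lingual transfer.

```python
# Toy zero-shot cross-lingual transfer: train on English, test on Spanish.
# The hand-crafted 2-d "aligned multilingual embeddings" below are
# hypothetical; real systems use multilingual encoders such as XLM-R.
# Positive words cluster near (1, 1), negative words near (-1, -1),
# regardless of language.
EMBED = {
    "good":     (0.9, 1.0),   "great":  (1.0, 0.8),
    "bad":      (-1.0, -0.9), "awful":  (-0.8, -1.0),
    "bueno":    (0.8, 0.9),    # Spanish, positive
    "terrible": (-0.9, -0.8),  # Spanish, negative
}

# Labeled training data exists only for English (1 = positive, 0 = negative).
ENGLISH_TRAIN = [("good", 1), ("great", 1), ("bad", 0), ("awful", 0)]

def train_centroids(examples):
    """Nearest-centroid classifier: average the embeddings of each class."""
    sums = {0: [0.0, 0.0], 1: [0.0, 0.0]}
    counts = {0: 0, 1: 0}
    for word, label in examples:
        x, y = EMBED[word]
        sums[label][0] += x
        sums[label][1] += y
        counts[label] += 1
    return {c: (sums[c][0] / counts[c], sums[c][1] / counts[c]) for c in sums}

def predict(centroids, word):
    """Assign a word to the class whose centroid is nearest in embedding space."""
    x, y = EMBED[word]
    def dist2(c):
        cx, cy = centroids[c]
        return (x - cx) ** 2 + (y - cy) ** 2
    return min(centroids, key=dist2)

centroids = train_centroids(ENGLISH_TRAIN)
# Zero-shot transfer: no Spanish training data was ever seen.
print(predict(centroids, "bueno"))     # → 1 (positive)
print(predict(centroids, "terrible"))  # → 0 (negative)
```

The sketch works only because the embedding space is shared across languages; in practice that alignment comes from multilingual pretraining or explicit embedding alignment, and it is exactly what the papers listed below study and improve.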

Papers

Showing 11–20 of 782 papers

Title | Status | Hype
Multilingual Large Language Models: A Systematic Survey | Code | 1
From One to Many: Expanding the Scope of Toxicity Mitigation in Language Models | Code | 1
IRCoder: Intermediate Representations Make Language Models Robust Multilingual Code Generators | Code | 1
AdaMergeX: Cross-Lingual Transfer with Large Language Models via Adaptive Adapter Merging | Code | 1
ColBERT-XM: A Modular Multi-Vector Representation Model for Zero-Shot Multilingual Information Retrieval | Code | 1
Investigating Cultural Alignment of Large Language Models | Code | 1
LEIA: Facilitating Cross-lingual Knowledge Transfer in Language Models with Entity-based Data Augmentation | Code | 1
UltraLink: An Open-Source Knowledge-Enhanced Multilingual Supervised Fine-tuning Dataset | Code | 1
Turning English-centric LLMs Into Polyglots: How Much Multilinguality Is Needed? | Code | 1
TaCo: Enhancing Cross-Lingual Transfer for Low-Resource Languages in LLMs through Translation-Assisted Chain-of-Thought Processes | Code | 1
Page 2 of 79

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | PaLM 2 (few-shot) | Accuracy | 94.4 | – | Unverified
2 | mT0-13B | Accuracy | 84.45 | – | Unverified
3 | RoBERTa Large (translate test) | Accuracy | 76.05 | – | Unverified
4 | BLOOMZ | Accuracy | 75.5 | – | Unverified
5 | MAD-X Base | Accuracy | 60.94 | – | Unverified
6 | mGPT | Accuracy | 55.5 | – | Unverified