SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that uses data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
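The idea can be illustrated with a toy, self-contained sketch (all data and helper names below are hypothetical, not from any listed paper): a classifier is trained only on English labeled examples and then applied zero-shot to a lower-resource language, relying on a representation space shared across languages. Here, character trigrams stand in for the shared subword representations of a real multilingual encoder such as XLM-R:

```python
from collections import Counter

def trigrams(text):
    """Character trigrams act as a crude stand-in for a shared
    multilingual subword vocabulary."""
    padded = f"  {text.lower()}  "
    return Counter(padded[i:i + 3] for i in range(len(padded) - 2))

def cosine(a, b):
    """Cosine similarity between two sparse trigram count vectors."""
    num = sum(a[g] * b[g] for g in set(a) & set(b))
    den = (sum(v * v for v in a.values()) ** 0.5
           * sum(v * v for v in b.values()) ** 0.5)
    return num / den if den else 0.0

# "High-resource" supervision: English-only labeled examples.
train = [
    ("an excellent fantastic film", "pos"),
    ("a terrible horrible film", "neg"),
]

# Build one centroid per class in the shared trigram space.
centroids = {}
for text, label in train:
    centroids.setdefault(label, Counter()).update(trigrams(text))

def predict(text):
    """Assign the label whose centroid is closest in the shared space."""
    vec = trigrams(text)
    return max(centroids, key=lambda lbl: cosine(vec, centroids[lbl]))

# Zero-shot evaluation on Spanish: no Spanish labels were seen, but
# cognates ("excelente", "terrible") overlap in trigram space.
print(predict("una película excelente y fantástica"))
print(predict("una película terrible y horrible"))
```

Real systems replace the trigram space with a multilingual encoder fine-tuned on the source language, but the transfer pattern — train on the high-resource language, evaluate directly on the target — is the same.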

Papers

Showing 91–100 of 782 papers

Title | Status | Hype
Improving Zero-Shot Cross-Lingual Transfer via Progressive Code-Switching | — | 0
Probing the Emergence of Cross-lingual Alignment during LLM Training | — | 0
Self-Distillation for Model Stacking Unlocks Cross-Lingual NLU in 200+ Languages | — | 0
News Without Borders: Domain Adaptation of Multilingual Sentence Embeddings for Cross-lingual News Recommendation | Code | 0
How Can We Effectively Expand the Vocabulary of LLMs with 0.01GB of Target Language Text? | Code | 0
UniBridge: A Unified Approach to Cross-Lingual Transfer Learning for Low-Resource Languages | Code | 0
ThaiCoref: Thai Coreference Resolution Dataset | Code | 0
What Drives Performance in Multilingual Language Models? | Code | 0
Unknown Script: Impact of Script on Cross-Lingual Transfer | Code | 0
Comparing LLM prompting with Cross-lingual transfer performance on Indigenous and Low-resource Brazilian Languages | — | 0

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | PaLM 2 (few-shot) | Accuracy | 94.4 | — | Unverified
2 | mT0-13B | Accuracy | 84.45 | — | Unverified
3 | RoBERTa Large (translate test) | Accuracy | 76.05 | — | Unverified
4 | BLOOMZ | Accuracy | 75.5 | — | Unverified
5 | MAD-X Base | Accuracy | 60.94 | — | Unverified
6 | mGPT | Accuracy | 55.5 | — | Unverified