SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that uses data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
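As a toy illustration of the idea, the sketch below simulates a shared multilingual embedding space with random vectors (the "languages", task, and noise level are all invented for this example, not taken from any paper above): a classifier fit only on source-language examples is applied unchanged to target-language inputs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a shared multilingual embedding space:
# each concept has one underlying vector, and each language
# maps to it with small language-specific noise.
n_concepts, dim = 100, 16
concepts = rng.normal(size=(n_concepts, dim))
labels = (concepts[:, 0] > 0).astype(int)  # a linearly separable toy task

english = concepts + 0.05 * rng.normal(size=concepts.shape)  # source language
target = concepts + 0.05 * rng.normal(size=concepts.shape)   # target language

# Train a linear classifier on the source language only
# (least-squares regression onto +/-1 labels).
y = 2 * labels - 1
w, *_ = np.linalg.lstsq(english, y, rcond=None)

# Zero-shot transfer: reuse the same weights on the target language.
pred = (target @ w > 0).astype(int)
accuracy = (pred == labels).mean()
print(f"zero-shot target-language accuracy: {accuracy:.2f}")
```

Because both languages are embedded near the same concept vectors, the decision boundary learned on the source carries over; real systems achieve this sharing with multilingual pretrained encoders rather than synthetic noise.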

Papers

Showing 51–60 of 782 papers

| Title | Status | Hype |
| --- | --- | --- |
| No Culture Left Behind: ArtELingo-28, a Benchmark of WikiArt with Captions in 28 Languages | Code | 0 |
| Code-Switching Curriculum Learning for Multilingual Transfer in LLMs | — | 0 |
| Building Dialogue Understanding Models for Low-resource Language Indonesian from Scratch | — | 0 |
| Cross-lingual Transfer of Reward Models in Multilingual Alignment | Code | 0 |
| Exploring Pretraining via Active Forgetting for Improving Cross Lingual Transfer for Decoder Language Models | — | 0 |
| Parameter-efficient Adaptation of Multilingual Multimodal Models for Low-resource ASR | — | 0 |
| Scaling Laws for Multilingual Language Models | — | 0 |
| Cross-lingual Transfer for Automatic Question Generation by Learning Interrogative Structures in Target Languages | — | 0 |
| Table Question Answering for Low-resourced Indic Languages | Code | 0 |
| Layer Swapping for Zero-Shot Cross-Lingual Transfer in Large Language Models | — | 0 |
Page 6 of 79

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | PaLM 2 (few-shot) | Accuracy | 94.4 | — | Unverified |
| 2 | mT0-13B | Accuracy | 84.45 | — | Unverified |
| 3 | RoBERTa Large (translate test) | Accuracy | 76.05 | — | Unverified |
| 4 | BLOOMZ | Accuracy | 75.5 | — | Unverified |
| 5 | MAD-X Base | Accuracy | 60.94 | — | Unverified |
| 6 | mGPT | Accuracy | 55.5 | — | Unverified |