SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer is transfer learning that uses data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
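A minimal sketch of the zero-shot setting: a classifier is trained only on English examples but evaluated on another language, relying on a shared multilingual representation space. Everything below is invented for illustration — the toy embedding table (where translation pairs share a vector) is an idealization of what multilingual encoders like mBERT or XLM-R approximate in practice.

```python
import numpy as np

# Toy shared multilingual embedding space: translation pairs map to the
# same vector. This idealizes the assumption behind zero-shot transfer.
EMB = {
    "good": np.array([1.0, 0.2]),   "bueno": np.array([1.0, 0.2]),
    "great": np.array([0.9, 0.1]),  "genial": np.array([0.9, 0.1]),
    "bad": np.array([-1.0, 0.3]),   "malo": np.array([-1.0, 0.3]),
    "awful": np.array([-0.8, 0.4]), "horrible": np.array([-0.8, 0.4]),
}

def embed(sentence):
    """Mean-pool word vectors; out-of-vocabulary words are skipped."""
    vecs = [EMB[w] for w in sentence.split() if w in EMB]
    return np.mean(vecs, axis=0)

# "Train" on English only: a nearest-centroid sentiment classifier
# (1 = positive, 0 = negative).
train = [("good great", 1), ("bad awful", 0)]
centroids = {label: embed(text) for text, label in train}

def predict(sentence):
    v = embed(sentence)
    return min(centroids, key=lambda y: np.linalg.norm(v - centroids[y]))

# Zero-shot evaluation on Spanish: no Spanish labels were ever seen.
print(predict("bueno genial"))   # positive -> 1
print(predict("malo horrible"))  # negative -> 0
```

Because the Spanish words land on the same vectors as their English translations, the English-trained decision rule transfers directly; in real systems the shared space is only approximate, which is why the papers below study when and how well this works.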

Papers

Showing 131–140 of 782 papers

| Title | Status | Hype |
| --- | --- | --- |
| AdaMergeX: Cross-Lingual Transfer with Large Language Models via Adaptive Adapter Merging | Code | 1 |
| Exploring Multilingual Concepts of Human Value in Large Language Models: Is Value Alignment Consistent, Transferable and Controllable across Languages? | Code | 0 |
| Emotion Classification in Low and Moderate Resource Languages | — | 0 |
| Towards Explainability and Fairness in Swiss Judgement Prediction: Benchmarking on a Multilingual Dataset | — | 0 |
| ColBERT-XM: A Modular Multi-Vector Representation Model for Zero-Shot Multilingual Information Retrieval | Code | 1 |
| Mitigating the Linguistic Gap with Phonemic Representations for Robust Cross-lingual Transfer | — | 0 |
| Zero-shot cross-lingual transfer in instruction tuning of large language models | — | 0 |
| Analysis of Multi-Source Language Training in Cross-Lingual Transfer | — | 0 |
| Investigating Cultural Alignment of Large Language Models | Code | 1 |
| Key ingredients for effective zero-shot cross-lingual knowledge transfer in generative tasks | — | 0 |
Page 14 of 79

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | PaLM 2 (few-shot) | Accuracy | 94.4 | — | Unverified |
| 2 | mT0-13B | Accuracy | 84.45 | — | Unverified |
| 3 | RoBERTa Large (translate test) | Accuracy | 76.05 | — | Unverified |
| 4 | BLOOMZ | Accuracy | 75.5 | — | Unverified |
| 5 | MAD-X Base | Accuracy | 60.94 | — | Unverified |
| 6 | mGPT | Accuracy | 55.5 | — | Unverified |