SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that uses data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
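The idea can be illustrated with a minimal sketch: if a multilingual encoder maps semantically equivalent inputs from different languages into a shared embedding space, a classifier trained only on high-resource-language data can be applied zero-shot to the low-resource language. The toy "aligned embedding" below is hypothetical and stands in for a real multilingual encoder such as mBERT or XLM-R.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for an aligned multilingual embedding space (hypothetical):
# each concept has one anchor vector, and tokens for that concept in ANY
# language land near the same anchor.
n_concepts, dim = 2, 16
anchors = rng.normal(size=(n_concepts, dim))

def embed(concept_ids, noise=0.1):
    """Map concept ids to noisy embeddings, language-agnostic by design."""
    return anchors[concept_ids] + noise * rng.normal(size=(len(concept_ids), dim))

# Labeled data exists only for the high-resource language (e.g., English).
y_train = rng.integers(0, n_concepts, size=200)
X_train = embed(y_train)

# Train a simple nearest-centroid classifier on English data only.
centroids = np.stack([X_train[y_train == c].mean(axis=0)
                      for c in range(n_concepts)])

def predict(X):
    # Assign each embedding to the class of its nearest centroid.
    dists = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
    return dists.argmin(axis=1)

# Zero-shot transfer: evaluate on the low-resource language, whose
# embeddings share the same space but were never seen during training.
y_test = rng.integers(0, n_concepts, size=100)
X_test = embed(y_test)
acc = (predict(X_test) == y_test).mean()
print(f"zero-shot target-language accuracy: {acc:.2f}")
```

Because the two "languages" share one embedding space by construction, the English-trained classifier transfers almost perfectly; in practice, transfer quality depends on how well the multilingual encoder actually aligns the languages involved.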

Papers

Showing 271–280 of 782 papers

| Title | Status | Hype |
| --- | --- | --- |
| Efficiently Adapting Pretrained Language Models To New Languages | — | 0 |
| Vicinal Risk Minimization for Few-Shot Cross-lingual Transfer in Abusive Language Detection | — | 0 |
| X-SNS: Cross-Lingual Transfer Prediction through Sub-Network Similarity | — | 0 |
| Learning Transfers over Several Programming Languages | — | 0 |
| A Multi-Modal Multilingual Benchmark for Document Image Classification | — | 0 |
| ZGUL: Zero-shot Generalization to Unseen Languages using Multi-source Ensembling of Language Adapters | Code | 0 |
| Improving Cross-Lingual Transfer through Subtree-Aware Word Reordering | Code | 0 |
| Investigating Bias in Multilingual Language Models: Cross-Lingual Transfer of Debiasing Techniques | Code | 0 |
| One For All & All For One: Bypassing Hyperparameter Tuning with Model Averaging For Cross-Lingual Transfer | Code | 0 |
| To token or not to token: A Comparative Study of Text Representations for Cross-Lingual Transfer | Code | 0 |

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | PaLM 2 (few-shot) | Accuracy | 94.4 | — | Unverified |
| 2 | mT0-13B | Accuracy | 84.45 | — | Unverified |
| 3 | RoBERTa Large (translate test) | Accuracy | 76.05 | — | Unverified |
| 4 | BLOOMZ | Accuracy | 75.5 | — | Unverified |
| 5 | MAD-X Base | Accuracy | 60.94 | — | Unverified |
| 6 | mGPT | Accuracy | 55.5 | — | Unverified |