SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that leverages data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
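The intuition can be illustrated with a toy sketch: a classifier trained only on English text can still make sensible predictions on another language when both languages are mapped into a shared representation. The sketch below is an illustrative assumption, not the method of any paper listed here; it uses character trigrams as a crude stand-in for a shared multilingual subword vocabulary, so cognates like "excellent"/"excelente" overlap in feature space.

```python
from collections import Counter
import math

def trigrams(text):
    # Character trigrams act as a crude "shared multilingual vocabulary":
    # cognates such as "excellent"/"excelente" share subword units.
    padded = f"  {text.lower()}  "
    return Counter(padded[i:i + 3] for i in range(len(padded) - 2))

def cosine(a, b):
    # Cosine similarity between two sparse trigram-count vectors.
    dot = sum(count * b[gram] for gram, count in a.items())
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# "High-resource" training data: English only.
train = [
    ("the food was excellent and fantastic", "pos"),
    ("an absolutely magnificent performance", "pos"),
    ("a terrible and horrible experience", "neg"),
    ("the service was atrocious", "neg"),
]

# Nearest-centroid classifier: one summed trigram vector per label.
centroids = {}
for text, label in train:
    centroids.setdefault(label, Counter()).update(trigrams(text))

def classify(text):
    return max(centroids, key=lambda lbl: cosine(trigrams(text), centroids[lbl]))

# Zero-shot "transfer" to Spanish: no Spanish training data was seen,
# but shared subword units carry the signal across languages.
print(classify("una comida excelente y fantastica"))   # pos
print(classify("una experiencia terrible y horrible")) # neg
```

Real systems replace the trigram featurizer with a multilingual pretrained encoder (e.g., mBERT or XLM-R), but the mechanism is the same: fine-tune on the high-resource language, then evaluate directly on the target language.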

Papers

Showing 451–460 of 782 papers

| Title | Status | Hype |
|---|---|---|
| mLUKE: The Power of Entity Representations in Multilingual Pretrained Language Models | Code | 1 |
| Composable Sparse Fine-Tuning for Cross-Lingual Transfer | Code | 1 |
| Learning Compact Metrics for MT | Code | 1 |
| K-Wav2vec 2.0: Automatic Speech Recognition based on Joint Decoding of Graphemes and Syllables | Code | 1 |
| Unsupervised Cross-Lingual Transfer of Structured Predictors without Source Data | Code | 0 |
| Magic dust for cross-lingual adaptation of monolingual wav2vec-2.0 | — | 0 |
| Cross-Language Learning for Entity Matching | Code | 0 |
| Sequential Reptile: Inter-Task Gradient Alignment for Multilingual Learning | — | 0 |
| Using Optimal Transport as Alignment Objective for fine-tuning Multilingual Contextualized Embeddings | — | 0 |
| Analyzing the Effects of Reasoning Types on Cross-Lingual Transfer Performance | Code | 0 |
Page 46 of 79

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | PaLM 2 (few-shot) | Accuracy | 94.4 | — | Unverified |
| 2 | mT0-13B | Accuracy | 84.45 | — | Unverified |
| 3 | RoBERTa Large (translate test) | Accuracy | 76.05 | — | Unverified |
| 4 | BLOOMZ | Accuracy | 75.5 | — | Unverified |
| 5 | MAD-X Base | Accuracy | 60.94 | — | Unverified |
| 6 | mGPT | Accuracy | 55.5 | — | Unverified |