SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that uses data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
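The definition above can be illustrated with a toy sketch: if words from different languages live in a shared (aligned) embedding space, a classifier trained only on English labels can be applied zero-shot to another language. The embedding values and the nearest-centroid classifier below are illustrative assumptions, not any specific system; real pipelines use multilingual encoders such as mBERT or XLM-R.

```python
# Toy illustration of zero-shot cross-lingual transfer.
# Hypothetical aligned embedding space: the vectors are made up
# for illustration; real systems learn them with multilingual models.
EMBED = {
    "good": (0.9, 0.1), "bad": (0.1, 0.9),         # English
    "bon":  (0.85, 0.15), "mauvais": (0.15, 0.85),  # French
}

def train_centroids(examples):
    """Average the embeddings per label (nearest-centroid classifier)."""
    sums, counts = {}, {}
    for word, label in examples:
        x, y = EMBED[word]
        sx, sy = sums.get(label, (0.0, 0.0))
        sums[label] = (sx + x, sy + y)
        counts[label] = counts.get(label, 0) + 1
    return {lab: (sx / counts[lab], sy / counts[lab])
            for lab, (sx, sy) in sums.items()}

def predict(word, centroids):
    """Assign the label whose centroid is closest in embedding space."""
    x, y = EMBED[word]
    return min(centroids,
               key=lambda lab: (x - centroids[lab][0]) ** 2
                             + (y - centroids[lab][1]) ** 2)

# Train only on English examples...
centroids = train_centroids([("good", "pos"), ("bad", "neg")])
# ...then evaluate zero-shot on French words never seen in training.
print(predict("bon", centroids))      # pos
print(predict("mauvais", centroids))  # neg
```

Because the two languages share one vector space, no French labels are needed at training time; this is the "zero-shot" setting that many of the papers listed below study.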

Papers

Showing 211–220 of 782 papers

| Title | Status | Hype |
|---|---|---|
| Distilling Efficient Language-Specific Models for Cross-Lingual Transfer | Code | 0 |
| The Effects of Input Type and Pronunciation Dictionary Usage in Transfer Learning for Low-Resource Text-to-Speech | — | 0 |
| Improved Cross-Lingual Transfer Learning For Automatic Speech Translation | — | 0 |
| SLABERT Talk Pretty One Day: Modeling Second Language Acquisition with BERT | — | 0 |
| Pre-Trained Language-Meaning Models for Multilingual Parsing and Generation | Code | 0 |
| Why Does Zero-Shot Cross-Lingual Generation Fail? An Explanation and a Solution | — | 0 |
| Free Lunch: Robust Cross-Lingual Transfer via Model Checkpoint Averaging | Code | 0 |
| Towards a Common Understanding of Contributing Factors for Cross-Lingual Transfer in Multilingual Language Models: A Review | — | 0 |
| Revisiting non-English Text Simplification: A Unified Multilingual Benchmark | Code | 1 |
| Meta-learning For Vision-and-language Cross-lingual Transfer | — | 0 |
Page 22 of 79

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | PaLM 2 (few-shot) | Accuracy | 94.4 | — | Unverified |
| 2 | mT0-13B | Accuracy | 84.45 | — | Unverified |
| 3 | RoBERTa Large (translate test) | Accuracy | 76.05 | — | Unverified |
| 4 | BLOOMZ | Accuracy | 75.5 | — | Unverified |
| 5 | MAD-X Base | Accuracy | 60.94 | — | Unverified |
| 6 | mGPT | Accuracy | 55.5 | — | Unverified |