SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that leverages data and models from a language with ample resources (e.g., English) to solve the same tasks in another, typically lower-resource, language.
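The idea can be illustrated with a toy sketch: a classifier is trained only on English examples, then applied zero-shot to Spanish. This assumes an idealized shared embedding space in which translation pairs get identical vectors; real multilingual encoders (e.g., XLM-R) only approximate this, and all vectors and vocabulary below are invented for illustration.

```python
# Toy zero-shot cross-lingual transfer. Assumption: a shared multilingual
# embedding in which translation equivalents share one (made-up) vector.
EMB = {
    "good": (1.0, 0.2), "bueno": (1.0, 0.2),
    "great": (0.9, 0.1), "genial": (0.9, 0.1),
    "bad": (-1.0, 0.3), "malo": (-1.0, 0.3),
    "awful": (-0.8, 0.4), "horrible": (-0.8, 0.4),
}

def embed(sentence):
    # Bag-of-embeddings sentence encoder: mean of known word vectors.
    vecs = [EMB[w] for w in sentence.split() if w in EMB]
    return tuple(sum(v[i] for v in vecs) / len(vecs) for i in range(2))

# "Train" on English only: one centroid per sentiment label.
train = [("good great", 1), ("bad awful", 0)]
centroids = {label: embed(text) for text, label in train}

def predict(sentence):
    # Nearest-centroid classification by squared Euclidean distance.
    v = embed(sentence)
    return min(
        centroids,
        key=lambda lab: sum((a - b) ** 2 for a, b in zip(v, centroids[lab])),
    )

# Zero-shot evaluation on Spanish, which was never seen during training.
print(predict("bueno genial"))   # → 1 (positive)
print(predict("malo horrible"))  # → 0 (negative)
```

Because the Spanish words map onto the same representation space the English classifier was trained in, no target-language labels are needed; this is the mechanism that multilingual pretraining aims to provide at scale.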

Papers

Showing 411-420 of 782 papers

Title                                                                            | Status | Hype
Self-Distillation for Model Stacking Unlocks Cross-Lingual NLU in 200+ Languages |        | 0
Self-Supervised Representations Improve End-to-End Speech Translation            |        | 0
Self-Translate-Train: Enhancing Cross-Lingual Transfer of Large Language Models via Inherent Capability |        | 0
Semantic Pivots Enable Cross-Lingual Transfer in Large Language Models           |        | 0
SenWiCh: Sense-Annotation of Low-Resource Languages for WiC using Hybrid Methods |        | 0
Sequence Mixup for Zero-Shot Cross-Lingual Part-Of-Speech Tagging                |        | 0
Sequential Reptile: Inter-Task Gradient Alignment for Multilingual Learning      |        | 0
Sharing, Teaching and Aligning: Knowledgeable Transfer Learning for Cross-Lingual Machine Reading Comprehension |        | 0
SIGTYP 2020 Shared Task: Prediction of Typological Features                      |        | 0
SLABERT Talk Pretty One Day: Modeling Second Language Acquisition with BERT      |        | 0
Page 42 of 79

Benchmark Results

# | Model                          | Metric   | Claimed | Verified | Status
1 | PaLM 2 (few-shot)              | Accuracy | 94.4    |          | Unverified
2 | mT0-13B                        | Accuracy | 84.45   |          | Unverified
3 | RoBERTa Large (translate test) | Accuracy | 76.05   |          | Unverified
4 | BLOOMZ                         | Accuracy | 75.5    |          | Unverified
5 | MAD-X Base                     | Accuracy | 60.94   |          | Unverified
6 | mGPT                           | Accuracy | 55.5    |          | Unverified