SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that uses the data and models available for a high-resource language (e.g., English) to solve tasks in another, typically lower-resource, language.
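
The core idea can be illustrated with a toy sketch: if words from both languages live in a shared cross-lingual embedding space, a classifier trained only on English labels can score target-language inputs it never saw during training (zero-shot transfer). All vectors, words, and labels below are hypothetical stand-ins, not real embeddings.

```python
# Toy zero-shot cross-lingual transfer sketch (all data hypothetical).
# Assumption: a shared embedding space where translation pairs map to
# (nearly) the same vector, as aligned multilingual embeddings aim to do.
EMBED = {
    "good": (0.9, 0.1), "great": (0.8, 0.2),
    "bad": (0.1, 0.9), "awful": (0.2, 0.8),
    # Spanish words share the space with their English counterparts.
    "bueno": (0.9, 0.1), "genial": (0.8, 0.2),
    "malo": (0.1, 0.9), "horrible": (0.2, 0.8),
}

# Labeled data exists only for the high-resource language (English).
train = [("good", "pos"), ("great", "pos"), ("bad", "neg"), ("awful", "neg")]

def centroid(vecs):
    """Mean vector of a list of 2-d points."""
    n = len(vecs)
    return tuple(sum(v[i] for v in vecs) / n for i in range(2))

# Fit a nearest-centroid classifier on English vectors only.
centroids = {
    label: centroid([EMBED[w] for w, y in train if y == label])
    for label in ("pos", "neg")
}

def predict(word):
    """Assign the label whose centroid is closest in the shared space."""
    v = EMBED[word]
    return min(
        centroids,
        key=lambda y: sum((v[i] - centroids[y][i]) ** 2 for i in range(2)),
    )

# Zero-shot: classify Spanish words never seen during training.
print(predict("bueno"))     # pos
print(predict("horrible"))  # neg
```

Real systems replace the toy dictionary with a multilingual encoder (e.g., mBERT or XLM-R) fine-tuned on English task data and evaluated directly on the target language, but the mechanism is the same: supervision transfers through a shared representation space.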

Papers

Showing 571–580 of 782 papers

| Title | Status | Hype |
| --- | --- | --- |
| Task-Specific Pre-Training and Cross Lingual Transfer for Sentiment Analysis in Dravidian Code-Switched Languages | | 0 |
| Meta-Learning with MAML on Trees | | 0 |
| Zero-Shot Cross-Lingual Dependency Parsing through Contextual Embedding Transformation | Code | 0 |
| Task-Specific Pre-Training and Cross Lingual Transfer for Code-Switched Data | | 0 |
| Bilingual Language Modeling, A transfer learning technique for Roman Urdu | | 0 |
| RUBERT: A Bilingual Roman Urdu BERT Using Cross Lingual Transfer Learning | | 0 |
| Beyond the English Web: Zero-Shot Cross-Lingual and Lightweight Monolingual Classification of Registers | Code | 0 |
| PPT: Parsimonious Parser Transfer for Unsupervised Cross-Lingual Adaptation | Code | 0 |
| First Align, then Predict: Understanding the Cross-Lingual Ability of Multilingual BERT | Code | 0 |
| Analyzing Zero-shot Cross-lingual Transfer in Supervised NLP Tasks | | 0 |

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | PaLM 2 (few-shot) | Accuracy | 94.4 | | Unverified |
| 2 | mT0-13B | Accuracy | 84.45 | | Unverified |
| 3 | RoBERTa Large (translate test) | Accuracy | 76.05 | | Unverified |
| 4 | BLOOMZ | Accuracy | 75.5 | | Unverified |
| 5 | MAD-X Base | Accuracy | 60.94 | | Unverified |
| 6 | mGPT | Accuracy | 55.5 | | Unverified |