SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that leverages data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
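The most common instance of this idea is zero-shot transfer: train a classifier on labeled English data only, then apply it unchanged to another language, relying on a shared multilingual representation space. The toy sketch below illustrates the mechanism with hand-crafted "aligned" word embeddings standing in for a multilingual encoder such as mBERT or XLM-R; all vectors, words, and the perceptron trainer are illustrative assumptions, not taken from any paper listed here.

```python
# Toy zero-shot cross-lingual transfer: train on English, test on Spanish.
# Hypothetical aligned embeddings -- translation-equivalent words share a
# vector, mimicking the shared space a multilingual encoder would provide.
EMBED = {
    "good":  (1.0, 0.2),  "bueno":    (1.0, 0.2),
    "great": (0.9, 0.1),  "genial":   (0.9, 0.1),
    "bad":   (-1.0, 0.3), "malo":     (-1.0, 0.3),
    "awful": (-0.9, 0.4), "horrible": (-0.9, 0.4),
}

def encode(sentence):
    """Mean-pool word vectors (a stand-in for a multilingual encoder)."""
    vecs = [EMBED[w] for w in sentence.split() if w in EMBED]
    n = max(len(vecs), 1)
    return tuple(sum(v[i] for v in vecs) / n for i in range(2))

def train(examples, epochs=50, lr=0.1):
    """Perceptron-style training on the source-language data only."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for text, label in examples:           # label: +1 positive, -1 negative
            x = encode(text)
            if (w[0] * x[0] + w[1] * x[1] + b) * label <= 0:  # misclassified
                w[0] += lr * label * x[0]
                w[1] += lr * label * x[1]
                b += lr * label

    def predict(text):
        x = encode(text)
        return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else -1

    return predict

# English-only training data.
predict = train([("good", 1), ("great", 1), ("bad", -1), ("awful", -1)])

# Zero-shot evaluation: Spanish inputs never seen during training.
print(predict("bueno"))     # → 1
print(predict("horrible"))  # → -1
```

Because "bueno" shares a vector with "good", the English-trained decision boundary carries over with no target-language supervision; the papers indexed below study when and why this transfer succeeds or fails with real multilingual encoders.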

Papers

Showing 391–400 of 782 papers

Frustratingly Simple Regularization to Improve Zero-shot Cross-lingual Robustness
A Balanced Data Approach for Evaluating Cross-Lingual Transfer: Mapping the Linguistic Blood Bank
One Step Is Enough for Few-Shot Cross-Lingual Transfer: Co-Training with Gradient Optimization
Surprisingly Simple Adapter Ensembling for Zero-Shot Cross-Lingual Sequence Tagging
Orthogonal Language and Task Adapters in Zero-Shot Cross-Lingual Transfer
Multi2WOZ: A Robust Multilingual Dataset and Conversational Pretraining for Task-Oriented Dialog
Don't Forget Cheap Training Signals Before Building Unsupervised Bilingual Word Embeddings
When More is not Necessary Better: Multilingual Auxiliary Tasks for Zero-Shot Cross-Lingual Transfer of Hate Speech Detection Models
BAD-X: Bilingual Adapters Improve Zero-Shot Cross-Lingual Transfer
Realistic Zero-Shot Cross-Lingual Transfer in Legal Topic Classification

Benchmark Results

#   Model                           Metric     Claimed   Verified   Status
1   PaLM 2 (few-shot)               Accuracy   94.4      —          Unverified
2   mT0-13B                         Accuracy   84.45     —          Unverified
3   RoBERTa Large (translate test)  Accuracy   76.05     —          Unverified
4   BLOOMZ                          Accuracy   75.5      —          Unverified
5   MAD-X Base                      Accuracy   60.94     —          Unverified
6   mGPT                            Accuracy   55.5      —          Unverified