SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer is transfer learning that uses data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
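The idea behind zero-shot cross-lingual transfer can be illustrated with a toy sketch: if translation-equivalent words land near the same point in a shared multilingual embedding space, a classifier trained only on English labels can be applied to another language unseen at training time. The embedding table, word list, and nearest-centroid classifier below are all hypothetical assumptions for illustration, not a real multilingual encoder.

```python
# Toy sketch of zero-shot cross-lingual transfer (illustrative only).
# Hypothetical shared embedding space: translation pairs map to nearly
# the same point, as a multilingual encoder aims to arrange.
EMBED = {
    # English (source language, labeled)
    "good":  (0.9, 0.1), "great": (0.8, 0.2),
    "bad":   (0.1, 0.9), "awful": (0.2, 0.8),
    # French (target language, no labels seen during training)
    "bon":   (0.85, 0.15), "mauvais": (0.15, 0.85),
}

def centroid(words):
    # Mean vector of the given words in the shared space.
    vecs = [EMBED[w] for w in words]
    return tuple(sum(v[i] for v in vecs) / len(vecs) for i in range(2))

# "Train" on English only: one centroid per sentiment class.
classes = {
    "positive": centroid(["good", "great"]),
    "negative": centroid(["bad", "awful"]),
}

def classify(word):
    # Nearest-centroid classification in the shared embedding space.
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    v = EMBED[word]
    return min(classes, key=lambda c: dist2(v, classes[c]))

# Zero-shot evaluation on French, never seen with labels in training.
print(classify("bon"))      # -> positive
print(classify("mauvais"))  # -> negative
```

Real systems replace the toy table with a pretrained multilingual encoder (e.g., the mT0 or MAD-X models listed below), but the transfer recipe is the same: fit on the high-resource language, evaluate directly on the target language.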

Papers

Showing 761–770 of 782 papers

Title | Status | Hype
Don't Use English Dev: On the Zero-Shot Cross-Lingual Evaluation of Contextual Embeddings | | 0
On the Role of Parallel Data in Cross-lingual Transfer Learning | | 0
On the Universality of Deep Contextual Language Models | | 0
On the Usability of Transformers-based models for a French Question-Answering task | | 0
On Zero-shot Cross-lingual Transfer of Multilingual Neural Machine Translation | | 0
Optimizing Two-Pass Cross-Lingual Transfer Learning: Phoneme Recognition and Phoneme to Grapheme Translation | | 0
Orthogonal Language and Task Adapters in Zero-Shot Cross-Lingual Transfer | | 0
Overcoming Vocabulary Constraints with Pixel-level Fallback | | 0
Parameter-efficient Adaptation of Multilingual Multimodal Models for Low-resource ASR | | 0
Page 77 of 79

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | PaLM 2 (few-shot) | Accuracy | 94.4 | | Unverified
2 | mT0-13B | Accuracy | 84.45 | | Unverified
3 | RoBERTa Large (translate test) | Accuracy | 76.05 | | Unverified
4 | BLOOMZ | Accuracy | 75.5 | | Unverified
5 | MAD-X Base | Accuracy | 60.94 | | Unverified
6 | mGPT | Accuracy | 55.5 | | Unverified