SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that uses data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
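One family of approaches in this area (seen in several papers below, e.g. on transferring embeddings between languages) aligns monolingual embedding spaces using a small seed dictionary, then transfers by nearest-neighbor lookup in the shared space. The sketch below is a toy 2-D illustration of that idea, not any specific paper's method: the vocabularies, vectors, and rotation angle are invented, and the closed-form 2-D rotation fit stands in for the SVD-based Procrustes solution used at scale.

```python
import math

# Toy 2-D "embeddings" for a source-language (English) vocabulary.
src = {
    "dog":   (1.0, 0.2),
    "cat":   (0.9, 0.4),
    "house": (-0.3, 1.0),
    "tree":  (-0.8, 0.5),
}

def rotate(v, theta):
    c, s = math.cos(theta), math.sin(theta)
    return (c * v[0] - s * v[1], s * v[0] + c * v[1])

# Target-language embeddings: same geometry, rotated by an angle the
# alignment procedure does not know (synthetic, noise-free data).
TRUE_THETA = 0.7
tgt = {
    "hund":  rotate(src["dog"], TRUE_THETA),
    "katze": rotate(src["cat"], TRUE_THETA),
    "haus":  rotate(src["house"], TRUE_THETA),
    "baum":  rotate(src["tree"], TRUE_THETA),
}

def fit_rotation(pairs):
    """Least-squares rotation aligning source to target from seed pairs."""
    dot = sum(x[0] * y[0] + x[1] * y[1] for x, y in pairs)
    cross = sum(x[0] * y[1] - x[1] * y[0] for x, y in pairs)
    return math.atan2(cross, dot)

# Learn the mapping from three seed translation pairs only;
# "tree"/"baum" is held out to test transfer.
seed = [(src["dog"], tgt["hund"]),
        (src["cat"], tgt["katze"]),
        (src["house"], tgt["haus"])]
theta = fit_rotation(seed)

def translate(word):
    """Map a source word into target space; return the nearest target word."""
    v = rotate(src[word], theta)
    return min(tgt, key=lambda w: (tgt[w][0] - v[0]) ** 2
                                  + (tgt[w][1] - v[1]) ** 2)

print(translate("tree"))  # nearest neighbor of the mapped held-out word
```

Because the toy target space is an exact rotation of the source space, three seed pairs recover the mapping perfectly; real cross-lingual alignment works with noisy, high-dimensional spaces and orthogonal (not just rotational) maps.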

Papers

Showing 501–525 of 782 papers

| Title | Status | Hype |
|---|---|---|
| Zero-shot Cross-lingual Transfer is Under-specified Optimization | — | 0 |
| xGQA: Cross-Lingual Visual Question Answering | — | 0 |
| Subword-based Cross-lingual Transfer of Embeddings from Hindi to Marathi | — | 0 |
| Zero-shot Cross-Language Transfer of Monolingual Entity Linking Models | — | 0 |
| WECHSEL: Effective initialization of subword embeddings for cross-lingual transfer of monolingual language models | — | 0 |
| Unsupervised Cross-Lingual Transfer of Structured Predictors without Source Data | Code | 0 |
| Cross-Language Learning for Entity Matching | Code | 0 |
| Magic dust for cross-lingual adaptation of monolingual wav2vec-2.0 | — | 0 |
| Using Optimal Transport as Alignment Objective for fine-tuning Multilingual Contextualized Embeddings | — | 0 |
| Sequential Reptile: Inter-Task Gradient Alignment for Multilingual Learning | — | 0 |
| Analyzing the Effects of Reasoning Types on Cross-Lingual Transfer Performance | Code | 0 |
| Universal Recurrent Neural Network Grammar | — | 0 |
| Call Larisa Ivanovna: Code-Switching Fools Multilingual NLU Models | Code | 0 |
| Learning Invariant Representations on Multilingual Language Models for Unsupervised Cross-Lingual Transfer | — | 0 |
| Cross-lingual Intermediate Fine-tuning improves Dialogue State Tracking | Code | 0 |
| Rumour Detection via Zero-shot Cross-lingual Transfer Learning | — | 0 |
| Cross-Lingual Language Model Meta-Pretraining | — | 0 |
| Simple and Effective Zero-shot Cross-lingual Phoneme Recognition | Code | 0 |
| Multilingual Generative Language Models for Zero-Shot Cross-Lingual Event Argument Extraction | — | 0 |
| On the Universality of Deep Contextual Language Models | — | 0 |
| Cross-lingual Transfer of Monolingual Models | — | 0 |
| Improving Zero-shot Cross-lingual Transfer between Closely Related Languages by injecting Character-level Noise | — | 0 |
| A Simple and Effective Method To Eliminate the Self Language Bias in Multilingual Representations | Code | 0 |
| AfroMT: Pretraining Strategies and Reproducible Benchmarks for Translation of 8 African Languages | Code | 0 |
| On the ability of monolingual models to learn language-agnostic representations | — | 0 |
Page 21 of 32

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | PaLM 2 (few-shot) | Accuracy | 94.4 | — | Unverified |
| 2 | mT0-13B | Accuracy | 84.45 | — | Unverified |
| 3 | RoBERTa Large (translate test) | Accuracy | 76.05 | — | Unverified |
| 4 | BLOOMZ | Accuracy | 75.5 | — | Unverified |
| 5 | MAD-X Base | Accuracy | 60.94 | — | Unverified |
| 6 | mGPT | Accuracy | 55.5 | — | Unverified |