SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that leverages data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
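
Below is a minimal sketch of the zero-shot variant of this setup: a multilingual encoder is fine-tuned on labelled English data only and then applied directly to another language. It assumes the Hugging Face `transformers` library and the public `xlm-roberta-base` checkpoint; the tiny in-line English training pairs and the Spanish test sentence are illustrative stand-ins for real datasets, not part of any benchmark listed on this page.

```python
# Sketch of zero-shot cross-lingual transfer (assumptions: Hugging Face
# `transformers`, PyTorch, and the public "xlm-roberta-base" checkpoint;
# the in-line data below is purely illustrative).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "xlm-roberta-base"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# 1) Fine-tune on labelled English data only.
english_train = [("the movie was great", 1), ("a dull, lifeless film", 0)]
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for text, label in english_train:
    batch = tokenizer(text, return_tensors="pt", truncation=True)
    loss = model(**batch, labels=torch.tensor([label])).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

# 2) Evaluate directly on another language (here: Spanish) without any
#    target-language labels -- the "transfer" in cross-lingual transfer.
model.eval()
with torch.no_grad():
    batch = tokenizer("una película aburrida y sin vida", return_tensors="pt")
    pred = model(**batch).logits.argmax(dim=-1).item()
print("predicted label:", pred)
```

The same pattern, with full training sets and proper evaluation splits, underlies many of the papers and benchmark entries listed below.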

Papers

Showing 501–510 of 782 papers

Title | Status | Hype
Probing Multilingual Language Models for Discourse | — | 0
Investigating Transfer Learning in Multilingual Pre-trained Language Models through Chinese Natural Language Inference | Code | 1
BiToD: A Bilingual Multi-Domain Dataset For Task-Oriented Dialogue Modeling | Code | 1
MergeDistill: Merging Pre-trained Language Models using Distillation | — | 0
Language Scaling for Universal Suggested Replies Model | — | 0
Bilingual Alignment Pre-Training for Zero-Shot Cross-Lingual Transfer | Code | 0
Syntax-augmented Multilingual BERT for Cross-lingual Transfer | Code | 1
Language Embeddings for Typology and Cross-lingual Transfer Learning | Code | 0
How to Adapt Your Pretrained Multilingual Model to 1600 Languages | — | 0
ZmBART: An Unsupervised Cross-lingual Transfer Framework for Language Generation | Code | 1

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | PaLM 2 (few-shot) | Accuracy | 94.4 | — | Unverified
2 | mT0-13B | Accuracy | 84.45 | — | Unverified
3 | RoBERTa Large (translate test) | Accuracy | 76.05 | — | Unverified
4 | BLOOMZ | Accuracy | 75.5 | — | Unverified
5 | MAD-X Base | Accuracy | 60.94 | — | Unverified
6 | mGPT | Accuracy | 55.5 | — | Unverified