SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that leverages the data and models available for a high-resource language (e.g., English) to solve tasks in another, typically lower-resource, language.
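The idea can be illustrated with a toy sketch of zero-shot transfer: a classifier is trained only on high-resource-language examples represented in a shared embedding space, then applied unchanged to a second language whose words map into the same space. Everything below (the tiny aligned lexicon, the data, the perceptron-style trainer) is invented for illustration; real systems use multilingual encoders such as mBERT or XLM-R instead of a hand-built lexicon.

```python
import numpy as np

# Hypothetical *aligned* word embeddings: translation pairs share one
# vector, which is what multilingual encoders approximate in practice.
shared_emb = {
    "good": np.array([1.0, 0.2]),  "bueno":    np.array([1.0, 0.2]),
    "bad":  np.array([-1.0, 0.1]), "malo":     np.array([-1.0, 0.1]),
    "film": np.array([0.0, 1.0]),  "pelicula": np.array([0.0, 1.0]),
}

def featurize(tokens):
    # Mean-pool word vectors -- a crude stand-in for a sentence encoder.
    return np.mean([shared_emb[t] for t in tokens], axis=0)

# Train on English only, with simple perceptron updates.
train = [(["good", "film"], 1), (["bad", "film"], 0)]
w, b = np.zeros(2), 0.0
for _ in range(20):
    for toks, y in train:
        x = featurize(toks)
        pred = 1 if w @ x + b > 0 else 0
        w += (y - pred) * x
        b += (y - pred)

def predict(tokens):
    return 1 if w @ featurize(tokens) + b > 0 else 0

# Zero-shot evaluation on Spanish: no Spanish labels were ever seen.
print(predict(["bueno", "pelicula"]))  # 1 (positive)
print(predict(["malo", "pelicula"]))   # 0 (negative)
```

Because the two languages share one representation space, the decision boundary learned from English applies directly to Spanish inputs; this is the same mechanism, at toy scale, that the zero-shot papers listed below exploit.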

Papers

Showing 501–525 of 782 papers

| Title | Status | Hype |
|---|---|---|
| Probing Multilingual Language Models for Discourse |  | 0 |
| Investigating Transfer Learning in Multilingual Pre-trained Language Models through Chinese Natural Language Inference | Code | 1 |
| BiToD: A Bilingual Multi-Domain Dataset For Task-Oriented Dialogue Modeling | Code | 1 |
| MergeDistill: Merging Pre-trained Language Models using Distillation |  | 0 |
| Language Scaling for Universal Suggested Replies Model |  | 0 |
| Bilingual Alignment Pre-Training for Zero-Shot Cross-Lingual Transfer | Code | 0 |
| Syntax-augmented Multilingual BERT for Cross-lingual Transfer | Code | 1 |
| Language Embeddings for Typology and Cross-lingual Transfer Learning | Code | 0 |
| How to Adapt Your Pretrained Multilingual Model to 1600 Languages |  | 0 |
| ZmBART: An Unsupervised Cross-lingual Transfer Framework for Language Generation | Code | 1 |
| Improving Zero-Shot Cross-lingual Transfer for Multilingual Question Answering over Knowledge Graph |  | 0 |
| Cross-Lingual Word Embedding Refinement by ℓ1 Norm Optimisation | Code | 1 |
| Learning Domain-Specialised Representations for Cross-Lingual Biomedical Entity Linking | Code | 1 |
| Towards More Equitable Question Answering Systems: How Much More Data Do You Need? | Code | 0 |
| DaN+: Danish Nested Named Entities and Lexical Normalization | Code | 0 |
| Multilingual Domain Adaptation for NMT: Decoupling Language and Domain Information with Adapters |  | 0 |
| The interplay between language similarity and script on a novel multi-layer Algerian dialect corpus | Code | 0 |
| A cost-benefit analysis of cross-lingual transfer methods | Code | 1 |
| Analysing The Impact Of Linguistic Features On Cross-Lingual Transfer | Code | 0 |
| Assessing the Syntactic Capabilities of Transformer-based Multilingual Language Models |  | 0 |
| Multilingual and Zero-Shot is Closing in on Monolingual Web Register Classification |  | 0 |
| X-METRA-ADA: Cross-lingual Meta-Transfer Learning Adaptation to Natural Language Understanding and Question Answering | Code | 1 |
| Zero-shot Cross-lingual Transfer of Neural Machine Translation with Multilingual Pretrained Encoders | Code | 0 |
| AmericasNLI: Evaluating Zero-shot Natural Language Understanding of Pretrained Multilingual Models in Truly Low-resource Languages | Code | 0 |
| Crossing the Conversational Chasm: A Primer on Natural Language Processing for Multilingual Task-Oriented Dialogue Systems |  | 0 |
Page 21 of 32

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | PaLM 2 (few-shot) | Accuracy | 94.4 |  | Unverified |
| 2 | mT0-13B | Accuracy | 84.45 |  | Unverified |
| 3 | RoBERTa Large (translate test) | Accuracy | 76.05 |  | Unverified |
| 4 | BLOOMZ | Accuracy | 75.5 |  | Unverified |
| 5 | MAD-X Base | Accuracy | 60.94 |  | Unverified |
| 6 | mGPT | Accuracy | 55.5 |  | Unverified |