SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that uses data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
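As a toy illustration of the zero-shot setting described above: a classifier is trained only on labeled English examples, then applied to Spanish input by mapping both languages into a shared representation space. Here the "shared space" is simply an English bag-of-words reached through a tiny bilingual lexicon; the lexicon, data, and labels are all invented for the sketch, standing in for the multilingual embeddings a real system would use.

```python
# Toy sketch of zero-shot cross-lingual transfer (all data is invented).
# A centroid classifier is trained on English only; Spanish text is mapped
# into the shared (English) vocabulary and classified without any Spanish
# training data.
from collections import Counter

# Tiny hypothetical bilingual lexicon standing in for a shared
# multilingual embedding space.
ES_EN = {"buena": "good", "mala": "bad", "película": "movie",
         "excelente": "great", "terrible": "awful", "la": "the", "es": "is"}

def to_shared_space(tokens, lexicon=None):
    """Map tokens into the shared (English) vocabulary as a bag of words."""
    if lexicon:
        tokens = [lexicon.get(t, t) for t in tokens]
    return Counter(tokens)

# Labeled training data exists only in English (the high-resource language).
train = [("the movie is great", 1), ("the movie is awful", 0),
         ("good movie", 1), ("bad movie", 0)]

# "Training": accumulate one word-count centroid per class.
centroids = {0: Counter(), 1: Counter()}
for text, label in train:
    centroids[label] += to_shared_space(text.split())

def predict(tokens, lexicon=None):
    """Score input against each class centroid in the shared space."""
    bow = to_shared_space(tokens, lexicon)
    scores = {c: sum(bow[w] * cent[w] for w in bow)
              for c, cent in centroids.items()}
    return max(scores, key=scores.get)

# Zero-shot transfer: classify Spanish with an English-trained model.
print(predict("la película es excelente".split(), ES_EN))  # 1 (positive)
print(predict("la película es terrible".split(), ES_EN))   # 0 (negative)
```

Real systems replace the lexicon with a shared multilingual encoder (e.g., mBERT or XLM-R, which recur throughout the papers below), but the recipe is the same: train on the high-resource language, evaluate directly on the target language.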

Papers

Showing 501–550 of 782 papers

Title | Status | Hype
Probing Multilingual Language Models for Discourse | — | 0
Investigating Transfer Learning in Multilingual Pre-trained Language Models through Chinese Natural Language Inference | Code | 1
BiToD: A Bilingual Multi-Domain Dataset For Task-Oriented Dialogue Modeling | Code | 1
MergeDistill: Merging Pre-trained Language Models using Distillation | — | 0
Language Scaling for Universal Suggested Replies Model | — | 0
Bilingual Alignment Pre-Training for Zero-Shot Cross-Lingual Transfer | Code | 0
Syntax-augmented Multilingual BERT for Cross-lingual Transfer | Code | 1
Language Embeddings for Typology and Cross-lingual Transfer Learning | Code | 0
How to Adapt Your Pretrained Multilingual Model to 1600 Languages | — | 0
ZmBART: An Unsupervised Cross-lingual Transfer Framework for Language Generation | Code | 1
Improving Zero-Shot Cross-lingual Transfer for Multilingual Question Answering over Knowledge Graph | — | 0
Cross-Lingual Word Embedding Refinement by ℓ1 Norm Optimisation | Code | 1
Learning Domain-Specialised Representations for Cross-Lingual Biomedical Entity Linking | Code | 1
Towards More Equitable Question Answering Systems: How Much More Data Do You Need? | Code | 0
DaN+: Danish Nested Named Entities and Lexical Normalization | Code | 0
Multilingual Domain Adaptation for NMT: Decoupling Language and Domain Information with Adapters | — | 0
The interplay between language similarity and script on a novel multi-layer Algerian dialect corpus | Code | 0
A cost-benefit analysis of cross-lingual transfer methods | Code | 1
Analysing The Impact Of Linguistic Features On Cross-Lingual Transfer | Code | 0
Assessing the Syntactic Capabilities of Transformer-based Multilingual Language Models | — | 0
Multilingual and Zero-Shot is Closing in on Monolingual Web Register Classification | — | 0
X-METRA-ADA: Cross-lingual Meta-Transfer Learning Adaptation to Natural Language Understanding and Question Answering | Code | 1
Zero-shot Cross-lingual Transfer of Neural Machine Translation with Multilingual Pretrained Encoders | Code | 0
AmericasNLI: Evaluating Zero-shot Natural Language Understanding of Pretrained Multilingual Models in Truly Low-resource Languages | Code | 0
Crossing the Conversational Chasm: A Primer on Natural Language Processing for Multilingual Task-Oriented Dialogue Systems | — | 0
Improving Zero-Shot Cross-Lingual Transfer Learning via Robust Training | Code | 0
"Wikily" Supervised Neural Translation Tailored to Cross-Lingual Tasks | Code | 0
MetaXL: Meta Representation Transformation for Low-resource Cross-lingual Learning | Code | 1
Span Pointer Networks for Non-Autoregressive Task-Oriented Semantic Parsing | — | 0
XTREME-R: Towards More Challenging and Nuanced Multilingual Evaluation | Code | 1
Zero-Shot Cross-lingual Semantic Parsing | Code | 1
Fine-tuning Encoders for Improved Monolingual and Zero-shot Polylingual Neural Topic Modeling | Code | 0
MCL@IITK at SemEval-2021 Task 2: Multilingual and Cross-lingual Word-in-Context Disambiguation using Augmented Data, Signals, and Transformers | — | 0
Improving Cross-Lingual Transfer for Event Argument Extraction with Language-Universal Sentence Structures | — | 0
To Block or not to Block: Experiments with Machine Learning for News Comment Moderation | — | 0
Negation Scope Resolution for Chinese as a Second Language | — | 0
Task-Specific Pre-Training and Cross Lingual Transfer for Sentiment Analysis in Dravidian Code-Switched Languages | — | 0
Cross-Lingual Transfer Learning for Hate Speech Detection | — | 0
GCDH@LT-EDI-EACL2021: XLM-RoBERTa for Hope Speech Detection in English, Malayalam, and Tamil | — | 0
Cross-Lingual Transfer with MAML on Trees | — | 0
Project-then-Transfer: Effective Two-stage Cross-lingual Transfer for Semantic Dependency Parsing | — | 0
Code-Mixing on Sesame Street: Dawn of the Adversarial Polyglots | Code | 1
Multilingual Multimodal Pre-training for Zero-Shot Cross-Lingual Transfer of Vision-Language Models | Code | 1
Multi-view Subword Regularization | Code | 1
Meta-Learning with MAML on Trees | — | 0
Zero-Shot Cross-Lingual Dependency Parsing through Contextual Embedding Transformation | Code | 0
Task-Specific Pre-Training and Cross Lingual Transfer for Code-Switched Data | — | 0
RUBERT: A Bilingual Roman Urdu BERT Using Cross Lingual Transfer Learning | — | 0
Bilingual Language Modeling, A transfer learning technique for Roman Urdu | — | 0
Page 11 of 16

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | PaLM 2 (few-shot) | Accuracy | 94.4 | — | Unverified
2 | mT0-13B | Accuracy | 84.45 | — | Unverified
3 | RoBERTa Large (translate test) | Accuracy | 76.05 | — | Unverified
4 | BLOOMZ | Accuracy | 75.5 | — | Unverified
5 | MAD-X Base | Accuracy | 60.94 | — | Unverified
6 | mGPT | Accuracy | 55.5 | — | Unverified