SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that uses data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
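A common enabler of zero-shot cross-lingual transfer is a shared multilingual embedding space: a model is trained only on source-language examples, then applied directly to target-language inputs embedded in the same space. The sketch below illustrates the idea with hand-crafted toy vectors and a nearest-centroid classifier; the word vectors and language pair are illustrative assumptions, not from any real encoder (in practice they would come from a multilingual model such as XLM-R).

```python
import numpy as np

# Toy shared embedding space (hand-crafted, hypothetical vectors).
en_train = {
    "good":  np.array([ 0.9,  0.1]),
    "great": np.array([ 0.8,  0.2]),
    "bad":   np.array([-0.9, -0.1]),
    "awful": np.array([-0.8, -0.2]),
}
en_labels = {"good": 1, "great": 1, "bad": 0, "awful": 0}

# Target-language words embedded in the SAME space -- the key assumption
# behind zero-shot cross-lingual transfer.
de_test = {
    "gut":      np.array([ 0.85,  0.15]),  # "good"
    "schlecht": np.array([-0.85, -0.15]),  # "bad"
}

# "Train" a nearest-centroid classifier on English data only.
pos = np.mean([v for w, v in en_train.items() if en_labels[w] == 1], axis=0)
neg = np.mean([v for w, v in en_train.items() if en_labels[w] == 0], axis=0)

def predict(vec):
    # Positive iff the vector is closer to the positive centroid.
    return 1 if np.linalg.norm(vec - pos) < np.linalg.norm(vec - neg) else 0

# Zero-shot transfer: classify German words never seen during training.
preds = {w: predict(v) for w, v in de_test.items()}
print(preds)  # {'gut': 1, 'schlecht': 0}
```

The classifier never sees a German example; it works only because both languages share one vector space, which is the property multilingual pre-training aims to provide.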

Papers

Showing 526–550 of 782 papers

| Title | Status | Hype |
|---|---|---|
| Improving Zero-Shot Cross-Lingual Transfer Learning via Robust Training | Code | 0 |
| "Wikily" Supervised Neural Translation Tailored to Cross-Lingual Tasks | Code | 0 |
| MetaXL: Meta Representation Transformation for Low-resource Cross-lingual Learning | Code | 1 |
| Span Pointer Networks for Non-Autoregressive Task-Oriented Semantic Parsing | – | 0 |
| XTREME-R: Towards More Challenging and Nuanced Multilingual Evaluation | Code | 1 |
| Zero-Shot Cross-lingual Semantic Parsing | Code | 1 |
| Cross-Lingual Word Embedding Refinement by ℓ1 Norm Optimisation | Code | 1 |
| Fine-tuning Encoders for Improved Monolingual and Zero-shot Polylingual Neural Topic Modeling | Code | 0 |
| MCL@IITK at SemEval-2021 Task 2: Multilingual and Cross-lingual Word-in-Context Disambiguation using Augmented Data, Signals, and Transformers | – | 0 |
| Improving Cross-Lingual Transfer for Event Argument Extraction with Language-Universal Sentence Structures | – | 0 |
| To Block or not to Block: Experiments with Machine Learning for News Comment Moderation | – | 0 |
| Negation Scope Resolution for Chinese as a Second Language | – | 0 |
| Task-Specific Pre-Training and Cross Lingual Transfer for Sentiment Analysis in Dravidian Code-Switched Languages | – | 0 |
| Cross-Lingual Transfer Learning for Hate Speech Detection | – | 0 |
| GCDH@LT-EDI-EACL2021: XLM-RoBERTa for Hope Speech Detection in English, Malayalam, and Tamil | – | 0 |
| Cross-Lingual Transfer with MAML on Trees | – | 0 |
| Project-then-Transfer: Effective Two-stage Cross-lingual Transfer for Semantic Dependency Parsing | – | 0 |
| Code-Mixing on Sesame Street: Dawn of the Adversarial Polyglots | Code | 1 |
| Multilingual Multimodal Pre-training for Zero-Shot Cross-Lingual Transfer of Vision-Language Models | Code | 1 |
| Multi-view Subword Regularization | Code | 1 |
| Meta-Learning with MAML on Trees | – | 0 |
| Zero-Shot Cross-Lingual Dependency Parsing through Contextual Embedding Transformation | Code | 0 |
| Task-Specific Pre-Training and Cross Lingual Transfer for Code-Switched Data | – | 0 |
| RUBERT: A Bilingual Roman Urdu BERT Using Cross Lingual Transfer Learning | – | 0 |
| Bilingual Language Modeling, A transfer learning technique for Roman Urdu | – | 0 |
Page 22 of 32

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | PaLM 2 (few-shot) | Accuracy | 94.4 | – | Unverified |
| 2 | mT0-13B | Accuracy | 84.45 | – | Unverified |
| 3 | RoBERTa Large (translate test) | Accuracy | 76.05 | – | Unverified |
| 4 | BLOOMZ | Accuracy | 75.5 | – | Unverified |
| 5 | MAD-X Base | Accuracy | 60.94 | – | Unverified |
| 6 | mGPT | Accuracy | 55.5 | – | Unverified |