SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that uses data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
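The idea can be illustrated with a toy sketch: a classifier trained only on English labels can still classify words in another language when both languages share an embedding space. Everything below is illustrative and assumed, not taken from any real model; real systems use multilingual encoders such as mBERT or XLM-R rather than a hand-built vector table.

```python
# Toy zero-shot cross-lingual transfer: the 2-D "shared embedding space"
# below is fabricated for illustration; translation pairs get nearby vectors.
EMBED = {
    # English                       # Spanish (same meaning, nearby vector)
    "good":  (0.90, 0.10),          "bueno":    (0.88, 0.12),
    "great": (0.80, 0.20),          "genial":   (0.82, 0.18),
    "bad":   (0.10, 0.90),          "malo":     (0.12, 0.88),
    "awful": (0.20, 0.80),          "horrible": (0.18, 0.82),
}

def centroid(words):
    vecs = [EMBED[w] for w in words]
    return tuple(sum(v[i] for v in vecs) / len(vecs) for i in range(2))

def train(labelled):
    """Fit a nearest-centroid classifier on English examples only."""
    by_label = {}
    for word, label in labelled:
        by_label.setdefault(label, []).append(word)
    return {label: centroid(words) for label, words in by_label.items()}

def predict(model, word):
    vec = EMBED[word]
    # Nearest centroid by squared Euclidean distance.
    return min(model, key=lambda lbl: sum((vec[i] - model[lbl][i]) ** 2
                                          for i in range(2)))

# Train with English sentiment labels only...
model = train([("good", "pos"), ("great", "pos"),
               ("bad", "neg"), ("awful", "neg")])

# ...then evaluate zero-shot on Spanish: no Spanish labels were ever seen.
print(predict(model, "bueno"))     # -> pos (transfers via the shared space)
print(predict(model, "horrible"))  # -> neg
```

Because the classifier operates on language-agnostic vectors rather than surface forms, supervision in the high-resource language carries over to the low-resource one for free; this is the mechanism the papers below build on and refine.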

Papers

Showing 121–130 of 782 papers

Title | Status | Hype
Can Machine Translation Bridge Multilingual Pretraining and Cross-lingual Transfer Learning? | — | 0
Towards Knowledge-Grounded Natural Language Understanding and Generation | — | 0
Cross-Lingual Transfer for Natural Language Inference via Multilingual Prompt Translator | — | 0
Few-Shot Cross-Lingual Transfer for Prompting Large Language Models in Low-Resource Languages | — | 0
Cross-lingual Transfer or Machine Translation? On Data Augmentation for Monolingual Semantic Textual Similarity | — | 0
Tracing the Roots of Facts in Multilingual Language Models: Independent, Shared, and Transferred Knowledge | Code | 0
DA-Net: A Disentangled and Adaptive Network for Multi-Source Cross-Lingual Transfer Learning | — | 0
IRCoder: Intermediate Representations Make Language Models Robust Multilingual Code Generators | Code | 1
From One to Many: Expanding the Scope of Toxicity Mitigation in Language Models | Code | 1
Cross-Lingual Learning vs. Low-Resource Fine-Tuning: A Case Study with Fact-Checking in Turkish | Code | 0
Page 13 of 79

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | PaLM 2 (few-shot) | Accuracy | 94.4 | — | Unverified
2 | mT0-13B | Accuracy | 84.45 | — | Unverified
3 | RoBERTa Large (translate test) | Accuracy | 76.05 | — | Unverified
4 | BLOOMZ | Accuracy | 75.5 | — | Unverified
5 | MAD-X Base | Accuracy | 60.94 | — | Unverified
6 | mGPT | Accuracy | 55.5 | — | Unverified