SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that leverages data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
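A minimal, purely illustrative sketch of the idea: a bag-of-words classifier is "trained" on English and then applied zero-shot to Spanish by mapping words from both languages into a shared representation space, mimicking how multilingual encoders (e.g., mBERT, XLM-R) share one space across languages. All vocabulary, weights, and data below are invented for illustration.

```python
# Toy zero-shot cross-lingual transfer: English and Spanish words that
# express the same concept are mapped to the same shared feature id,
# so a classifier fit on English features applies directly to Spanish.
# (Invented example; real systems learn this shared space from data.)

shared_ids = {
    "good": 0, "bueno": 0,    # positive concept
    "bad": 1, "malo": 1,      # negative concept
    "film": 2, "pelicula": 2, # neutral topic word
}

def featurize(text):
    """Map a sentence to counts over the 3 shared concept ids."""
    vec = [0, 0, 0]
    for tok in text.lower().split():
        if tok in shared_ids:
            vec[shared_ids[tok]] += 1
    return vec

# Weights as if learned from English sentiment data only:
# positive concept +1, negative concept -1, topic word 0.
weights = [1.0, -1.0, 0.0]

def predict(text):
    """Score a sentence in ANY covered language via the shared space."""
    score = sum(w * x for w, x in zip(weights, featurize(text)))
    return "positive" if score > 0 else "negative"

# Zero-shot evaluation on Spanish, which supplied no training data.
print(predict("bueno pelicula"))  # positive
print(predict("malo pelicula"))   # negative
```

The key point the sketch captures: because both languages land in the same feature space, supervision in the high-resource language transfers to the low-resource one without any target-language labels.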

Papers

Showing 21–30 of 782 papers

Title | Status | Hype
Overcoming Vocabulary Constraints with Pixel-level Fallback | | 0
Bridging the Linguistic Divide: A Survey on Leveraging Large Language Models for Machine Translation | | 0
JiraiBench: A Bilingual Benchmark for Evaluating Large Language Models' Detection of Human Self-Destructive Behavior Content in Jirai Community | | 0
Enhancing Small Language Models for Cross-Lingual Generalized Zero-Shot Classification with Soft Prompt Tuning | | 0
Untangling the Influence of Typology, Data and Model Architecture on Ranking Transfer Languages for Cross-Lingual POS Tagging | | 0
Language-specific Neurons Do Not Facilitate Cross-Lingual Transfer | | 0
Florenz: Scaling Laws for Systematic Generalization in Vision-Language Models | | 0
A Zero-shot Learning Method Based on Large Language Models for Multi-modal Knowledge Graph Embedding | | 0
Comparative Study of Zero-Shot Cross-Lingual Transfer for Bodo POS and NER Tagging Using Gemini 2.0 Flash Thinking Experimental Model | | 0
Char-mander Use mBackdoor! A Study of Cross-lingual Backdoor Attacks in Multilingual LLMs | Code | 0
Page 3 of 79

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | PaLM 2 (few-shot) | Accuracy | 94.4 | | Unverified
2 | mT0-13B | Accuracy | 84.45 | | Unverified
3 | RoBERTa Large (translate test) | Accuracy | 76.05 | | Unverified
4 | BLOOMZ | Accuracy | 75.5 | | Unverified
5 | MAD-X Base | Accuracy | 60.94 | | Unverified
6 | mGPT | Accuracy | 55.5 | | Unverified