SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that uses data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
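The idea can be sketched in a few lines: train on the high-resource language only, then predict on the low-resource language through a shared representation space. The tiny hand-made "multilingual" vectors below are illustrative assumptions, not real model weights; in practice the shared space would come from a multilingual encoder such as XLM-R.

```python
import numpy as np

# Toy shared embedding space: translation pairs map to the same vector.
# (Hand-crafted for illustration only.)
EMB = {
    # English                        # Spanish
    "good":  np.array([1.0, 0.2]),   "bueno":  np.array([1.0, 0.2]),
    "great": np.array([0.9, 0.1]),   "genial": np.array([0.9, 0.1]),
    "bad":   np.array([-1.0, 0.3]),  "malo":   np.array([-1.0, 0.3]),
    "awful": np.array([-0.8, 0.4]),  "fatal":  np.array([-0.8, 0.4]),
}

def embed(sentence):
    """Mean-pool the embeddings of known words; unknown words are skipped."""
    vecs = [EMB[w] for w in sentence.split() if w in EMB]
    return np.mean(vecs, axis=0)

# "Train" on English only: a nearest-centroid sentiment classifier.
train = [("good great", 1), ("bad awful", 0)]
centroids = {label: embed(text) for text, label in train}

def predict(sentence):
    """Return the label whose centroid is closest to the sentence vector."""
    v = embed(sentence)
    return min(centroids, key=lambda c: np.linalg.norm(v - centroids[c]))

# Zero-shot transfer: classify Spanish with no Spanish training data.
print(predict("muy bueno genial"))  # -> 1 (positive)
print(predict("malo fatal"))        # -> 0 (negative)
```

Because translation pairs share one vector, the English-trained decision rule applies unchanged to Spanish input; this is the same mechanism, at toy scale, that lets a classifier fine-tuned on English atop a multilingual encoder transfer zero-shot to other languages.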

Papers

Showing 81–90 of 782 papers

Title | Status | Hype
Cross-Lingual Transfer Learning for Speech Translation | — | 0
Too Late to Train, Too Early To Use? A Study on Necessity and Viability of Low-Resource Bengali LLMs | — | 0
Self-Translate-Train: Enhancing Cross-Lingual Transfer of Large Language Models via Inherent Capability | — | 0
Breaking the Script Barrier in Multilingual Pre-Trained Language Models with Transliteration-Based Post-Training Alignment | Code | 0
T-FREE: Subword Tokenizer-Free Generative LLMs via Sparse Representations for Memory-Efficient Embeddings | Code | 2
The Model Arena for Cross-lingual Sentiment Analysis: A Comparative Study in the Era of Large Language Models | — | 0
SSP: Self-Supervised Prompting for Cross-Lingual Transfer to Low-Resource Languages using Large Language Models | Code | 0
The Multilingual Alignment Prism: Aligning Global and Local Preferences to Reduce Harm | — | 0
A Three-Pronged Approach to Cross-Lingual Adaptation with Multilingual LLMs | — | 0
Large Language Models Are Cross-Lingual Knowledge-Free Reasoners | Code | 0
Page 9 of 79

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | PaLM 2 (few-shot) | Accuracy | 94.4 | — | Unverified
2 | mT0-13B | Accuracy | 84.45 | — | Unverified
3 | RoBERTa Large (translate test) | Accuracy | 76.05 | — | Unverified
4 | BLOOMZ | Accuracy | 75.5 | — | Unverified
5 | MAD-X Base | Accuracy | 60.94 | — | Unverified
6 | mGPT | Accuracy | 55.5 | — | Unverified