SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that uses the data and models available for a high-resource language (e.g., English) to solve tasks in another, typically lower-resource, language.
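
To make the idea concrete, here is a minimal toy sketch of zero-shot cross-lingual transfer: a nearest-centroid classifier is trained on English labels only, in a language-agnostic character n-gram feature space, and then applied unchanged to Spanish inputs. Everything here (the feature space, the classifier, the example sentences) is an illustrative assumption; real systems use multilingual encoders such as mBERT or XLM-R as the shared representation instead.

```python
# Toy sketch of zero-shot cross-lingual transfer (illustrative only).
# The "shared representation" is a character trigram space; real work
# would use a multilingual encoder (e.g., mBERT, XLM-R).
from collections import Counter
import math

def featurize(text, n=3):
    """Shared feature space: character n-gram counts (language-agnostic)."""
    text = f" {text.lower()} "
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[k] * b[k] for k in a.keys() & b.keys())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def train_centroids(labeled):
    """Sum n-gram counts per label; trained on the source language only."""
    centroids = {}
    for text, label in labeled:
        centroids.setdefault(label, Counter()).update(featurize(text))
    return centroids

def predict(centroids, text):
    """Assign the label whose centroid is closest in the shared space."""
    feats = featurize(text)
    return max(centroids, key=lambda lab: cosine(centroids[lab], feats))

# "Train" on English only...
english_train = [
    ("excellent fantastic wonderful", "pos"),
    ("terrible horrible awful", "neg"),
]
model = train_centroids(english_train)

# ...then transfer to Spanish, relying on shared subword overlap
# (e.g., "fantastico"/"fantastic", "horrible"/"horrible").
print(predict(model, "fantastico excelente"))
print(predict(model, "horrible terrible"))
```

The transfer works here only because the two languages share cognates; the same pipeline with a genuinely multilingual encoder is what the papers listed below evaluate at scale.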

Papers

Showing 701-710 of 782 papers

Title | Status | Hype
Limited-Resource Adapters Are Regularizers, Not Linguists | | 0
LLMs Are Globally Multilingual Yet Locally Monolingual: Exploring Knowledge Transfer via Language and Thought Theory | | 0
Low-resource named entity recognition via multi-source projection: Not quite there yet? | | 0
Low-Resource Syntactic Transfer with Unsupervised Source Reordering | | 0
How Many Languages Make Good Multilingual Instruction Tuning? A Case Study on BLOOM | | 0
Machine Translation for Livonian: Catering to 20 Speakers | | 0
MAD-G: Multilingual Adapter Generation for Efficient Cross-Lingual Transfer | | 0
Magic dust for cross-lingual adaptation of monolingual wav2vec-2.0 | | 0
MaiNLP at SemEval-2024 Task 1: Analyzing Source Language Selection in Cross-Lingual Textual Relatedness | | 0
Make the Best of Cross-lingual Transfer: Evidence from POS Tagging with over 100 Languages | | 0
Page 71 of 79

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | PaLM 2 (few-shot) | Accuracy | 94.4 | | Unverified
2 | mT0-13B | Accuracy | 84.45 | | Unverified
3 | RoBERTa Large (translate test) | Accuracy | 76.05 | | Unverified
4 | BLOOMZ | Accuracy | 75.5 | | Unverified
5 | MAD-X Base | Accuracy | 60.94 | | Unverified
6 | mGPT | Accuracy | 55.5 | | Unverified