SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that uses the data and models available for a resource-rich language (e.g., English) to solve tasks in another, typically lower-resource, language.
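The idea can be sketched in a toy, self-contained example (an illustrative sketch, not any specific paper's method): words from two languages are mapped into one shared embedding space, a classifier is trained only on English examples, and the trained model is then applied unchanged, zero-shot, to text in the other language. The embeddings, words, and perceptron here are all hypothetical illustrations.

```python
# Hypothetical shared bilingual embedding space: translation-equivalent
# words are assigned the same vector, which is what lets a model trained
# on one language transfer to the other.
EMBED = {
    "good": (1.0, 0.2), "bueno": (1.0, 0.2),
    "great": (0.9, 0.1), "genial": (0.9, 0.1),
    "bad": (-1.0, 0.3), "malo": (-1.0, 0.3),
    "awful": (-0.8, 0.4), "horrible": (-0.8, 0.4),
}

def featurize(sentence):
    """Average the embeddings of known words (a bag-of-vectors feature)."""
    vecs = [EMBED[w] for w in sentence.split() if w in EMBED]
    n = max(len(vecs), 1)
    return tuple(sum(v[i] for v in vecs) / n for i in range(2))

def train_perceptron(data, epochs=10):
    """Train a tiny perceptron on (sentence, label) pairs; labels are +/-1."""
    w = [0.0, 0.0]
    for _ in range(epochs):
        for sent, y in data:
            x = featurize(sent)
            # Standard perceptron update on misclassified examples.
            if y * sum(wi * xi for wi, xi in zip(w, x)) <= 0:
                w = [wi + y * xi for wi, xi in zip(w, x)]
    return w

def predict(w, sent):
    x = featurize(sent)
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else -1

# Train only on English sentiment examples...
english_train = [("good great", 1), ("bad awful", -1)]
w = train_perceptron(english_train)

# ...then evaluate zero-shot on Spanish: the shared embedding space
# carries the decision boundary across languages.
print(predict(w, "bueno genial"))   # → 1 (positive)
print(predict(w, "malo horrible"))  # → -1 (negative)
```

Real systems replace the hand-built embedding table with a multilingual pretrained encoder (as several of the papers below do), but the transfer pattern, train on the high-resource language, predict on the low-resource one through a shared representation, is the same.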

Papers

Showing 681–690 of 782 papers

| Title | Status | Hype |
|-------|--------|------|
| Language Chameleon: Transformation analysis between languages using Cross-lingual Post-training based on Pre-trained language models | | 0 |
| Language Contamination Helps Explain the Cross-lingual Capabilities of English Pretrained Models | | 0 |
| Language-Family Adapters for Multilingual Neural Machine Translation | | 0 |
| Language-Family Adapters for Low-Resource Multilingual Neural Machine Translation | | 0 |
| Language-independent Cross-lingual Contextual Representations | | 0 |
| Language Scaling for Universal Suggested Replies Model | | 0 |
| Language-specific Neurons Do Not Facilitate Cross-Lingual Transfer | | 0 |
| Languages You Know Influence Those You Learn: Impact of Language Characteristics on Multi-Lingual Text-to-Text Transfer | | 0 |
| Layer Swapping for Zero-Shot Cross-Lingual Transfer in Large Language Models | | 0 |
| Learning Cross-lingual Representations for Event Coreference Resolution with Multi-view Alignment and Optimal Transport | | 0 |

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|-------|--------|---------|----------|--------|
| 1 | PaLM 2 (few-shot) | Accuracy | 94.4 | | Unverified |
| 2 | mT0-13B | Accuracy | 84.45 | | Unverified |
| 3 | RoBERTa Large (translate test) | Accuracy | 76.05 | | Unverified |
| 4 | BLOOMZ | Accuracy | 75.5 | | Unverified |
| 5 | MAD-X Base | Accuracy | 60.94 | | Unverified |
| 6 | mGPT | Accuracy | 55.5 | | Unverified |