SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that uses data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
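A minimal sketch of the zero-shot recipe that many of the papers below build on: a classifier is trained on labeled data in the source language only, then applied unchanged to the target language through a shared multilingual representation space. Everything here is toy data invented for illustration; the hand-built two-dimensional "embedding space" stands in for what a pretrained multilingual encoder (e.g., mBERT or XLM-R) actually learns.

```python
# Toy zero-shot cross-lingual transfer: English/Spanish translation pairs
# share nearby vectors in a hand-built "multilingual embedding space".
EMBED = {
    "good":  (1.0, 0.1),  "bueno":    (0.9, 0.2),
    "great": (1.0, 0.0),  "genial":   (0.95, 0.05),
    "bad":   (-1.0, 0.1), "malo":     (-0.9, 0.2),
    "awful": (-1.0, 0.0), "terrible": (-0.95, 0.05),
}

def centroid(words):
    vecs = [EMBED[w] for w in words]
    return tuple(sum(v[i] for v in vecs) / len(vecs) for i in range(2))

def train(labeled_english):
    """Fit one centroid per label, using English examples only."""
    by_label = {}
    for word, label in labeled_english:
        by_label.setdefault(label, []).append(word)
    return {label: centroid(ws) for label, ws in by_label.items()}

def predict(model, word):
    """Nearest-centroid classification in the shared space."""
    vec = EMBED[word]
    sq_dist = lambda c: sum((a - b) ** 2 for a, b in zip(vec, c))
    return min(model, key=lambda label: sq_dist(model[label]))

# Train on English sentiment labels...
model = train([("good", "pos"), ("great", "pos"),
               ("bad", "neg"), ("awful", "neg")])

# ...then predict on Spanish words that were never labeled.
print(predict(model, "bueno"))     # -> pos
print(predict(model, "terrible"))  # -> neg
```

Because translations land close together in the shared space, the decision boundary learned from English carries over to Spanish with no target-language labels, which is the core assumption behind zero-shot cross-lingual transfer.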

Papers

Showing 611–620 of 782 papers

Title | Status | Hype
End-to-end Text-to-speech for Low-resource Languages by Cross-Lingual Transfer Learning | | 0
English Intermediate-Task Training Improves Zero-Shot Cross-Lingual Transfer Too | | 0
Enhancing Cross-lingual Prompting with Two-level Augmentation | | 0
Enhancing Cross-task Transfer of Large Language Models via Activation Steering | | 0
Enhancing LLM Language Adaption through Cross-lingual In-Context Pre-training | | 0
Enhancing Small Language Models for Cross-Lingual Generalized Zero-Shot Classification with Soft Prompt Tuning | | 0
EntityCS: Improving Zero-Shot Cross-lingual Transfer with Entity-Centric Code Switching | | 0
Errator: a Tool to Help Detect Annotation Errors in the Universal Dependencies Project | | 0
Evaluating and explaining training strategies for zero-shot cross-lingual news sentiment analysis | | 0
Page 62 of 79

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | PaLM 2 (few-shot) | Accuracy | 94.4 | | Unverified
2 | mT0-13B | Accuracy | 84.45 | | Unverified
3 | RoBERTa Large (translate test) | Accuracy | 76.05 | | Unverified
4 | BLOOMZ | Accuracy | 75.5 | | Unverified
5 | MAD-X Base | Accuracy | 60.94 | | Unverified
6 | mGPT | Accuracy | 55.5 | | Unverified