SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that uses data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
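The core idea can be sketched with a toy example: if a multilingual encoder maps translation-equivalent words to (approximately) the same vectors, a classifier trained only on English labels can be applied to another language with no labels at all. The embeddings and classifier below are illustrative stand-ins, not any real model's representations.

```python
# Toy sketch of zero-shot cross-lingual transfer (hypothetical data).
# A shared multilingual embedding space lets a classifier trained only
# on English examples be applied directly to Spanish text.

# Hypothetical aligned embeddings: translation pairs share one vector.
EMB = {
    "good":  (0.9, 0.1), "bueno":    (0.9, 0.1),
    "great": (0.8, 0.2), "genial":   (0.8, 0.2),
    "bad":   (0.1, 0.9), "malo":     (0.1, 0.9),
    "awful": (0.2, 0.8), "horrible": (0.2, 0.8),
}

def embed(text):
    """Mean-pool the vectors of the words we have embeddings for."""
    vecs = [EMB[w] for w in text.lower().split() if w in EMB]
    return tuple(sum(v[i] for v in vecs) / len(vecs) for i in range(2))

def train_centroids(examples):
    """Fit one centroid per label from (text, label) pairs."""
    by_label = {}
    for text, label in examples:
        by_label.setdefault(label, []).append(embed(text))
    return {lab: tuple(sum(v[i] for v in vs) / len(vs) for i in range(2))
            for lab, vs in by_label.items()}

def predict(centroids, text):
    """Assign the label whose centroid is closest in the shared space."""
    v = embed(text)
    return min(centroids,
               key=lambda lab: sum((v[i] - centroids[lab][i]) ** 2
                                   for i in range(2)))

# Train on English labels only...
centroids = train_centroids([("good great", "pos"), ("bad awful", "neg")])

# ...then classify Spanish text with zero Spanish labels (zero-shot transfer).
print(predict(centroids, "bueno genial"))    # pos
print(predict(centroids, "malo horrible"))   # neg
```

In practice the shared space comes from a pretrained multilingual encoder (e.g., multilingual BERT, as in several papers below), and the classifier is a fine-tuned head rather than nearest centroids, but the transfer mechanism is the same.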

Papers

Showing 631–640 of 782 papers

| Title | Status | Hype |
|---|---|---|
| Exploring the Impact of Data Quantity on ASR in Extremely Low-resource Languages | | 0 |
| Exposing the limits of Zero-shot Cross-lingual Hate Speech Detection | | 0 |
| Extending Multilingual BERT to Low-Resource Languages | | 0 |
| Feature Aggregation in Zero-Shot Cross-Lingual Transfer Using Multilingual BERT | | 0 |
| Few-Shot Cross-Lingual Transfer for Prompting Large Language Models in Low-Resource Languages | | 0 |
| Fine-Tuning BERT with Character-Level Noise for Zero-Shot Transfer to Dialects and Closely-Related Languages | | 0 |
| Florenz: Scaling Laws for Systematic Generalization in Vision-Language Models | | 0 |
| Fortification of Neural Morphological Segmentation Models for Polysynthetic Minimal-Resource Languages | | 0 |
| FrameNet on the Way to Babel: Creating a Bilingual FrameNet Using Wiktionary as Interlingual Connection | | 0 |
| FreeTransfer-X: Safe and Label-Free Cross-Lingual Transfer from Off-the-Shelf Models | | 0 |
Page 64 of 79

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | PaLM 2 (few-shot) | Accuracy | 94.4 | | Unverified |
| 2 | mT0-13B | Accuracy | 84.45 | | Unverified |
| 3 | RoBERTa Large (translate test) | Accuracy | 76.05 | | Unverified |
| 4 | BLOOMZ | Accuracy | 75.5 | | Unverified |
| 5 | MAD-X Base | Accuracy | 60.94 | | Unverified |
| 6 | mGPT | Accuracy | 55.5 | | Unverified |