SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that uses data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
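
In practice this often means fine-tuning a multilingual model on the high-resource language and evaluating it directly on the target language (zero-shot). As a self-contained toy illustration only (all sentences and labels below are made up for the example), even a simple bag-of-character-trigrams classifier trained purely on English can transfer to a closely related language through shared subwords:

```python
from collections import defaultdict

def trigrams(text):
    """Character trigrams per word; shared subwords carry the cross-lingual signal."""
    for word in text.lower().split():
        for i in range(len(word) - 2):
            yield word[i:i + 3]

def train(examples):
    """examples: list of (sentence, label) with label +1 (positive) / -1 (negative)."""
    weights = defaultdict(int)
    for sentence, label in examples:
        for tri in trigrams(sentence):
            weights[tri] += label
    return weights

def predict(weights, sentence):
    score = sum(weights[t] for t in trigrams(sentence))
    return +1 if score > 0 else -1

# Hypothetical toy data: train only on English sentiment labels...
english = [("excellent movie", +1), ("fantastic film", +1),
           ("terrible movie", -1), ("horrible plot", -1)]
w = train(english)

# ...then evaluate zero-shot on (unaccented) Spanish, with no Spanish training data.
print(predict(w, "pelicula excelente"))  # → 1
print(predict(w, "pelicula terrible"))   # → -1
```

Real systems replace the trigram features with a pretrained multilingual encoder, but the transfer mechanism is analogous: representations shared across languages let supervision in one language apply to another.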

Papers

Showing 511–520 of 782 papers

| Title | Status | Hype |
| --- | --- | --- |
| Zero-shot cross-lingual transfer language selection using linguistic similarity | | 0 |
| Zero-shot Cross-lingual Transfer Learning with Multiple Source and Target Languages for Information Extraction: Language Selection and Adversarial Training | | 0 |
| Zero-shot Cross-lingual Transfer without Parallel Corpus | | 0 |
| Zero-Shot Dependency Parsing with Worst-Case Aware Automated Curriculum Learning | | 0 |
| Zero-shot Reading Comprehension by Cross-lingual Transfer Learning with Multi-lingual Language Representation Model | | 0 |
| Fine-Tuning BERT with Character-Level Noise for Zero-Shot Transfer to Dialects and Closely-Related Languages | | 0 |
| Florenz: Scaling Laws for Systematic Generalization in Vision-Language Models | | 0 |
| Fortification of Neural Morphological Segmentation Models for Polysynthetic Minimal-Resource Languages | | 0 |
| FrameNet on the Way to Babel: Creating a Bilingual FrameNet Using Wiktionary as Interlingual Connection | | 0 |
| FreeTransfer-X: Safe and Label-Free Cross-Lingual Transfer from Off-the-Shelf Models | | 0 |
Page 52 of 79

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | PaLM 2 (few-shot) | Accuracy | 94.4 | | Unverified |
| 2 | mT0-13B | Accuracy | 84.45 | | Unverified |
| 3 | RoBERTa Large (translate test) | Accuracy | 76.05 | | Unverified |
| 4 | BLOOMZ | Accuracy | 75.5 | | Unverified |
| 5 | MAD-X Base | Accuracy | 60.94 | | Unverified |
| 6 | mGPT | Accuracy | 55.5 | | Unverified |