SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that uses data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
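A minimal sketch of the idea, assuming a shared multilingual embedding space in which translation pairs get similar vectors (here hand-crafted 2-d vectors stand in for a real multilingual encoder such as mBERT or XLM-R; all words, labels, and vectors below are toy assumptions): a classifier is fit on English labels only, then applied zero-shot to another language.

```python
# Toy zero-shot cross-lingual transfer, illustrative only.
# Assumption: a shared bilingual "embedding" table where English words
# and their Spanish translations have nearby vectors.
EMB = {
    # English                    # Spanish
    "good":  (0.9, 0.1),  "bueno":     (0.85, 0.15),
    "great": (0.8, 0.2),  "excelente": (0.75, 0.25),
    "bad":   (0.1, 0.9),  "malo":      (0.15, 0.85),
    "awful": (0.2, 0.8),  "terrible":  (0.25, 0.75),
}

def embed(sentence):
    """Average the vectors of known words (a crude sentence encoder)."""
    vecs = [EMB[w] for w in sentence.split() if w in EMB]
    return tuple(sum(v[i] for v in vecs) / len(vecs) for i in range(2))

def train(labelled):
    """Nearest-centroid 'classifier': one mean vector per label."""
    sums, counts = {}, {}
    for text, label in labelled:
        v = embed(text)
        s = sums.setdefault(label, [0.0, 0.0])
        s[0] += v[0]; s[1] += v[1]
        counts[label] = counts.get(label, 0) + 1
    return {lab: (s[0] / counts[lab], s[1] / counts[lab])
            for lab, s in sums.items()}

def predict(centroids, text):
    """Assign the label whose centroid is closest in the shared space."""
    v = embed(text)
    return min(centroids,
               key=lambda lab: sum((v[i] - centroids[lab][i]) ** 2
                                   for i in range(2)))

# Train on English labels only...
centroids = train([("good great", "pos"), ("bad awful", "neg")])
# ...then classify Spanish text with no Spanish labels (zero-shot).
print(predict(centroids, "bueno excelente"))  # -> pos
print(predict(centroids, "malo terrible"))    # -> neg
```

In real systems, the hand-crafted table is replaced by a pretrained multilingual encoder and the nearest-centroid step by a fine-tuned task head; the transfer mechanism is the same shared representation space.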

Papers

Showing 341-350 of 782 papers

Title | Status | Hype
Punctuation Restoration in Spanish Customer Support Transcripts using Transfer Learning | - | 0
Discovering Language-neutral Sub-networks in Multilingual Language Models | Code | 0
Overcoming Catastrophic Forgetting in Zero-Shot Cross-Lingual Generation | Code | 2
Bitext Mining Using Distilled Sentence Representations for Low-Resource Languages | - | 0
Analyzing the Mono- and Cross-Lingual Pretraining Dynamics of Multilingual Language Models | - | 0
The Importance of Being Parameters: An Intra-Distillation Method for Serious Gains | Code | 1
Local Byte Fusion for Neural Machine Translation | Code | 0
The Geometry of Multilingual Language Model Representations | Code | 1
Pre-training Data Quality and Quantity for a Low-Resource Language: New Corpus and BERT Models for Maltese | Code | 1
Multi2WOZ: A Robust Multilingual Dataset and Conversational Pretraining for Task-Oriented Dialog | Code | 0
Page 35 of 79

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | PaLM 2 (few-shot) | Accuracy | 94.4 | - | Unverified
2 | mT0-13B | Accuracy | 84.45 | - | Unverified
3 | RoBERTa Large (translate test) | Accuracy | 76.05 | - | Unverified
4 | BLOOMZ | Accuracy | 75.5 | - | Unverified
5 | MAD-X Base | Accuracy | 60.94 | - | Unverified
6 | mGPT | Accuracy | 55.5 | - | Unverified