SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer is transfer learning that uses data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
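To make the idea concrete, here is a minimal, self-contained toy sketch of zero-shot cross-lingual transfer. It assumes (as multilingual encoders like mBERT or XLM-R aim for) that translation-equivalent words land near each other in a shared embedding space; the embeddings, words, and the sentiment task below are all hand-crafted for illustration, not drawn from any real model.

```python
import numpy as np

# Hand-crafted "multilingual" embeddings: translation pairs occupy the same
# region of a shared 2-D space. Real systems learn this space via massively
# multilingual pre-training.
EMB = {
    # English (high-resource: used for training)
    "good": np.array([1.0, 0.1]),  "bad": np.array([-1.0, 0.1]),
    "great": np.array([0.9, 0.2]), "awful": np.array([-0.9, 0.2]),
    # German (treated as low-resource: seen only at test time)
    "gut": np.array([0.95, 0.1]), "schlecht": np.array([-0.95, 0.1]),
}

def encode(sentence):
    """Mean-pool word embeddings into a crude sentence vector."""
    return np.mean([EMB[w] for w in sentence.split()], axis=0)

# Train a linear sentiment classifier on English examples only.
X = np.stack([encode(s) for s in ["good", "great", "bad", "awful"]])
targets = np.array([1.0, 1.0, -1.0, -1.0])  # +1 = positive, -1 = negative
w = np.linalg.lstsq(X, targets, rcond=None)[0]  # least-squares fit

def predict(sentence):
    """Return 1 for positive sentiment, 0 for negative."""
    return int(encode(sentence) @ w > 0)

# Zero-shot transfer: the classifier never saw any German text,
# yet the shared embedding space carries the decision boundary over.
print(predict("gut"))       # 1 (positive)
print(predict("schlecht"))  # 0 (negative)
```

The transfer works only because the two languages share one representation space; papers in this area largely study how to build, align, or exploit such spaces when the target language has little or no labeled data.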

Papers

Showing 41–50 of 782 papers

| Title | Status | Hype |
|-------|--------|------|
| Cross-View Language Modeling: Towards Unified Cross-Lingual Cross-Modal Pre-training | Code | 1 |
| The Importance of Being Parameters: An Intra-Distillation Method for Serious Gains | Code | 1 |
| The Geometry of Multilingual Language Model Representations | Code | 1 |
| Pre-training Data Quality and Quantity for a Low-Resource Language: New Corpus and BERT Models for Maltese | Code | 1 |
| Enhancing Cross-lingual Transfer by Manifold Mixup | Code | 1 |
| Towards Making the Most of Cross-Lingual Transfer for Zero-Shot Neural Machine Translation | Code | 1 |
| Make the Best of Cross-lingual Transfer: Evidence from POS Tagging with over 100 Languages | Code | 1 |
| IndicXNLI: Evaluating Multilingual Inference for Indian Languages | Code | 1 |
| Adapting Pre-trained Language Models to African Languages via Multilingual Adaptive Fine-Tuning | Code | 1 |
| Few-Shot Cross-lingual Transfer for Coarse-grained De-identification of Code-Mixed Clinical Texts | Code | 1 |

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|-------|--------|---------|----------|--------|
| 1 | PaLM 2 (few-shot) | Accuracy | 94.4 | | Unverified |
| 2 | mT0-13B | Accuracy | 84.45 | | Unverified |
| 3 | RoBERTa Large (translate test) | Accuracy | 76.05 | | Unverified |
| 4 | BLOOMZ | Accuracy | 75.5 | | Unverified |
| 5 | MAD-X Base | Accuracy | 60.94 | | Unverified |
| 6 | mGPT | Accuracy | 55.5 | | Unverified |