SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that uses data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
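A minimal, self-contained sketch of the zero-shot flavor of this idea: a classifier is trained only on English examples, then applied to another language by mapping both into a shared representation space. The hand-built word vectors and the bilingual word list below are hypothetical stand-ins for what multilingual encoders such as mBERT or LASER learn from raw text.

```python
# Toy illustration of zero-shot cross-lingual transfer (hypothetical data).
# Translation pairs share the same vector in a hand-built "shared space",
# standing in for a learned multilingual embedding.
SHARED_SPACE = {
    # English (training language)
    "good": (1.0, 0.2), "great": (0.9, 0.1),
    "bad": (-1.0, 0.3), "awful": (-0.9, 0.4),
    # Spanish (target language), aligned with its English translation
    "bueno": (1.0, 0.2), "malo": (-1.0, 0.3),
}

def embed(sentence):
    """Average the shared-space vectors of the words we know."""
    vecs = [SHARED_SPACE[w] for w in sentence.split() if w in SHARED_SPACE]
    n = max(len(vecs), 1)
    return tuple(sum(v[i] for v in vecs) / n for i in range(2))

def train_centroids(labeled_english):
    """Fit a nearest-centroid classifier on English data only."""
    sums = {}
    for sent, label in labeled_english:
        e = embed(sent)
        acc, cnt = sums.get(label, ((0.0, 0.0), 0))
        sums[label] = (tuple(a + b for a, b in zip(acc, e)), cnt + 1)
    return {lab: tuple(a / c for a in acc) for lab, (acc, c) in sums.items()}

def predict(centroids, sentence):
    """Assign the label whose centroid is closest in the shared space."""
    e = embed(sentence)
    return min(centroids,
               key=lambda lab: sum((a - b) ** 2
                                   for a, b in zip(centroids[lab], e)))

english_train = [("good great", "pos"), ("bad awful", "neg")]
centroids = train_centroids(english_train)

# Zero-shot: Spanish inputs, never seen during training.
print(predict(centroids, "bueno"))  # -> pos
print(predict(centroids, "malo"))   # -> neg
```

In practice the shared space comes from a pretrained multilingual encoder rather than a dictionary, and the classifier is a fine-tuned head rather than centroids, but the transfer mechanism, training signal in one language and inference in another through a common representation, is the same.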

Papers

Showing 51–75 of 782 papers

| Title | Status | Hype |
| --- | --- | --- |
| Few-Shot Cross-lingual Transfer for Coarse-grained De-identification of Code-Mixed Clinical Texts | Code | 1 |
| Make the Best of Cross-lingual Transfer: Evidence from POS Tagging with over 100 Languages | Code | 1 |
| Massively Multilingual Sentence Embeddings for Zero-Shot Cross-Lingual Transfer and Beyond | Code | 1 |
| mCLIP: Multilingual CLIP via Cross-lingual Transfer | Code | 1 |
| Cross-Lingual Word Embedding Refinement by ℓ1 Norm Optimisation | Code | 1 |
| Choosing Transfer Languages for Cross-Lingual Learning | Code | 1 |
| Allophant: Cross-lingual Phoneme Recognition with Articulatory Attributes | Code | 1 |
| Modelling Latent Translations for Cross-Lingual Transfer | Code | 1 |
| MultiEURLEX -- A multi-lingual and multi-label legal document classification dataset for zero-shot cross-lingual transfer | Code | 1 |
| Multilingual Generative Language Models for Zero-Shot Cross-Lingual Event Argument Extraction | Code | 1 |
| A Chinese Corpus for Fine-grained Entity Typing | Code | 1 |
| Code-Mixing on Sesame Street: Dawn of the Adversarial Polyglots | Code | 1 |
| Cross-lingual Transfer for Text Classification with Dictionary-based Heterogeneous Graph | Code | 1 |
| BiToD: A Bilingual Multi-Domain Dataset For Task-Oriented Dialogue Modeling | Code | 1 |
| Bridging the Gap: Enhancing LLM Performance for Low-Resource African Languages with New Benchmarks, Fine-Tuning, and Cultural Adjustments | Code | 1 |
| Cross-View Language Modeling: Towards Unified Cross-Lingual Cross-Modal Pre-training | Code | 1 |
| Efficient Test Time Adapter Ensembling for Low-resource Language Varieties | Code | 1 |
| End-to-End Slot Alignment and Recognition for Cross-Lingual NLU | Code | 1 |
| Beto, Bentz, Becas: The Surprising Cross-Lingual Effectiveness of BERT | Code | 1 |
| Cross-Lingual Natural Language Generation via Pre-Training | Code | 1 |
| Finding Universal Grammatical Relations in Multilingual BERT | Code | 1 |
| From One to Many: Expanding the Scope of Toxicity Mitigation in Language Models | Code | 1 |
| Few-shot Learning with Multilingual Language Models | Code | 1 |
Page 3 of 32

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | PaLM 2 (few-shot) | Accuracy | 94.4 | | Unverified |
| 2 | mT0-13B | Accuracy | 84.45 | | Unverified |
| 3 | RoBERTa Large (translate test) | Accuracy | 76.05 | | Unverified |
| 4 | BLOOMZ | Accuracy | 75.5 | | Unverified |
| 5 | MAD-X Base | Accuracy | 60.94 | | Unverified |
| 6 | mGPT | Accuracy | 55.5 | | Unverified |