SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that uses the data and models available for a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
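The idea behind zero-shot cross-lingual transfer can be illustrated with a toy sketch: if two languages share a (roughly) aligned embedding space, a classifier trained only on high-resource-language labels can be applied unchanged to the low-resource language. The word vectors, sentences, and labels below are all hypothetical illustrations, not data from any real model.

```python
# Toy sketch of zero-shot cross-lingual transfer.
# Hypothetical aligned word embeddings: translation pairs (good/bueno,
# bad/malo, ...) map to nearby points in one shared space.
EMBED = {
    "good": (0.90, 0.10), "bueno": (0.88, 0.12),
    "great": (0.95, 0.05), "genial": (0.93, 0.07),
    "bad": (0.10, 0.90), "malo": (0.12, 0.88),
    "awful": (0.05, 0.95), "terrible": (0.07, 0.93),
}

def embed(sentence):
    """Bag-of-embeddings: average the vectors of known words."""
    vecs = [EMBED[w] for w in sentence.lower().split() if w in EMBED]
    n = len(vecs)
    return tuple(sum(v[i] for v in vecs) / n for i in range(2))

def train_centroids(examples):
    """Nearest-centroid classifier: one mean vector per label."""
    sums, counts = {}, {}
    for text, label in examples:
        v = embed(text)
        s = sums.setdefault(label, [0.0, 0.0])
        s[0] += v[0]
        s[1] += v[1]
        counts[label] = counts.get(label, 0) + 1
    return {lab: (s[0] / counts[lab], s[1] / counts[lab])
            for lab, s in sums.items()}

def predict(centroids, text):
    """Return the label whose centroid is closest to the sentence vector."""
    v = embed(text)
    return min(centroids,
               key=lambda lab: sum((v[i] - centroids[lab][i]) ** 2
                                   for i in range(2)))

# Train on English labels only...
centroids = train_centroids([("good great", "pos"), ("bad awful", "neg")])

# ...then evaluate zero-shot on Spanish: no Spanish labels were ever seen.
print(predict(centroids, "genial bueno"))   # pos
print(predict(centroids, "malo terrible"))  # neg
```

The shared space does the work: because "genial" lies near "great", the English-trained decision boundary carries over. Real systems get this alignment from multilingual pre-training (e.g., multilingual BERT-style encoders) rather than a hand-built lookup table.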

Papers

Showing 71–80 of 782 papers

Title | Status | Hype
Beto, Bentz, Becas: The Surprising Cross-Lingual Effectiveness of BERT | Code | 1
Cross-Lingual Natural Language Generation via Pre-Training | Code | 1
From One to Many: Expanding the Scope of Toxicity Mitigation in Language Models | Code | 1
Frustratingly Easy Label Projection for Cross-lingual Transfer | Code | 1
ColBERT-XM: A Modular Multi-Vector Representation Model for Zero-Shot Multilingual Information Retrieval | Code | 1
CONCRETE: Improving Cross-lingual Fact-checking with Cross-lingual Retrieval | Code | 1
Composable Sparse Fine-Tuning for Cross-Lingual Transfer | Code | 1
Inducing Language-Agnostic Multilingual Representations | Code | 1
FILTER: An Enhanced Fusion Method for Cross-lingual Language Understanding | Code | 1
Investigating Transfer Learning in Multilingual Pre-trained Language Models through Chinese Natural Language Inference | Code | 1
Page 8 of 79

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | PaLM 2 (few-shot) | Accuracy | 94.4 | — | Unverified
2 | mT0-13B | Accuracy | 84.45 | — | Unverified
3 | RoBERTa Large (translate test) | Accuracy | 76.05 | — | Unverified
4 | BLOOMZ | Accuracy | 75.5 | — | Unverified
5 | MAD-X Base | Accuracy | 60.94 | — | Unverified
6 | mGPT | Accuracy | 55.5 | — | Unverified