SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that uses data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
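The idea can be illustrated with a minimal sketch: if two languages share one embedding space (as multilingual encoders aim to provide), a classifier trained only on English labels can be applied zero-shot to another language. The embeddings, vocabulary, and sentiment data below are toy values invented for illustration, not from any real model.

```python
# Toy sketch of zero-shot cross-lingual transfer (all data hypothetical).
# A nearest-centroid classifier is trained only on English examples; because
# both languages share one embedding space, it can label German inputs too.

# Hypothetical shared embedding space: translation pairs sit at roughly
# the same point, which is what multilingual encoders aim for.
EMBED = {
    # English                  # German
    "good":  (0.9, 0.1),  "gut":       (0.88, 0.12),
    "great": (0.8, 0.2),  "toll":      (0.82, 0.18),
    "bad":   (0.1, 0.9),  "schlecht":  (0.12, 0.88),
    "awful": (0.2, 0.8),  "furchtbar": (0.18, 0.82),
}

def embed(text):
    """Mean-pool the embeddings of known tokens."""
    vecs = [EMBED[t] for t in text.lower().split() if t in EMBED]
    return tuple(sum(v[i] for v in vecs) / len(vecs) for i in range(2))

def train_centroids(labeled):
    """Compute one mean vector per label (nearest-centroid classifier)."""
    sums, counts = {}, {}
    for text, label in labeled:
        v = embed(text)
        s = sums.setdefault(label, [0.0, 0.0])
        s[0] += v[0]; s[1] += v[1]
        counts[label] = counts.get(label, 0) + 1
    return {lab: (s[0] / counts[lab], s[1] / counts[lab]) for lab, s in sums.items()}

def predict(centroids, text):
    """Assign the label whose centroid is closest in the shared space."""
    v = embed(text)
    return min(centroids,
               key=lambda lab: sum((v[i] - centroids[lab][i]) ** 2 for i in range(2)))

# Train on English labels only ...
centroids = train_centroids([("good great", "pos"), ("bad awful", "neg")])
# ... then evaluate zero-shot on German.
print(predict(centroids, "gut toll"))            # -> pos
print(predict(centroids, "schlecht furchtbar"))  # -> neg
```

Real systems replace the toy lookup table with a pretrained multilingual encoder and the centroid rule with a fine-tuned task head, but the transfer mechanism is the same: supervision in one language, inference in another, through a shared representation.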

Papers

Showing 581–590 of 782 papers

| Title | Status | Hype |
| --- | --- | --- |
| AmericasNLI: Evaluating Zero-shot Natural Language Understanding of Pretrained Multilingual Models in Truly Low-resource Languages | Code | 0 |
| Unknown Script: Impact of Script on Cross-Lingual Transfer | Code | 0 |
| Explicit Alignment Objectives for Multilingual Bidirectional Encoders | Code | 0 |
| Frustratingly Simple but Surprisingly Strong: Using Language-Independent Features for Zero-shot Cross-lingual Semantic Parsing | Code | 0 |
| UNKs Everywhere: Adapting Multilingual Language Models to New Scripts | Code | 0 |
| On the Applicability of Zero-Shot Cross-Lingual Transfer Learning for Sentiment Classification in Distant Language Pairs | Code | 0 |
| Similarity of Sentence Representations in Multilingual LMs: Resolving Conflicting Literature and Case Study of Baltic Languages | Code | 0 |
| A Little Annotation does a Lot of Good: A Study in Bootstrapping Low-resource Named Entity Recognizers | Code | 0 |
| GL-CLeF: A Global-Local Contrastive Learning Framework for Cross-lingual Spoken Language Understanding | Code | 0 |
Page 59 of 79

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | PaLM 2 (few-shot) | Accuracy | 94.4 | | Unverified |
| 2 | mT0-13B | Accuracy | 84.45 | | Unverified |
| 3 | RoBERTa Large (translate test) | Accuracy | 76.05 | | Unverified |
| 4 | BLOOMZ | Accuracy | 75.5 | | Unverified |
| 5 | MAD-X Base | Accuracy | 60.94 | | Unverified |
| 6 | mGPT | Accuracy | 55.5 | | Unverified |