SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that uses data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
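A minimal, stdlib-only sketch of the idea using entirely hypothetical data: a classifier is trained only on "English" examples that live in a shared cross-lingual embedding space, then evaluated zero-shot on target-language examples embedded in the same space. The embeddings, labels, and example words below are invented for illustration, not drawn from any real model.

```python
# Toy zero-shot cross-lingual transfer (hypothetical data throughout):
# train on the high-resource language only, evaluate on the target language.
from math import dist

# Hypothetical shared embedding space: translation pairs map to nearby points.
english_train = {
    (0.9, 0.1): "pos",  # "good"
    (0.8, 0.2): "pos",  # "great"
    (0.1, 0.9): "neg",  # "bad"
    (0.2, 0.8): "neg",  # "awful"
}

target_test = {
    # near-translations occupy nearby points in the shared space
    (0.85, 0.15): "pos",  # e.g. a target-language word for "good"
    (0.15, 0.85): "neg",  # e.g. a target-language word for "bad"
}

def centroid(points):
    """Mean vector of a list of 2-D points."""
    xs, ys = zip(*points)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# "Train" a nearest-centroid classifier on English data only.
centroids = {
    label: centroid([v for v, l in english_train.items() if l == label])
    for label in {"pos", "neg"}
}

def predict(vec):
    """Assign the label whose English centroid is closest in the shared space."""
    return min(centroids, key=lambda label: dist(vec, centroids[label]))

# Zero-shot evaluation: no target-language training data was used.
correct = sum(predict(v) == gold for v, gold in target_test.items())
print(f"zero-shot accuracy: {correct / len(target_test):.2f}")
```

Because translation pairs sit close together in the shared space, the English-trained decision rule carries over to the target language; the multilingual-encoder papers listed below rely on the same property at scale.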

Papers

Showing 251–260 of 782 papers

| Title | Status | Hype |
| --- | --- | --- |
| Efficient Language Model Training through Cross-Lingual and Progressive Transfer Learning | Code | 0 |
| Cross-lingual Annotation Projection in Legal Texts | Code | 0 |
| Efficiently Aligned Cross-Lingual Transfer Learning for Conversational Tasks using Prompt-Tuning | Code | 0 |
| Analysing The Impact Of Linguistic Features On Cross-Lingual Transfer | Code | 0 |
| Beyond Data Quantity: Key Factors Driving Performance in Multilingual Language Models | Code | 0 |
| Frustratingly Simple but Surprisingly Strong: Using Language-Independent Features for Zero-shot Cross-lingual Semantic Parsing | Code | 0 |
| GL-CLeF: A Global-Local Contrastive Learning Framework for Cross-lingual Spoken Language Understanding | Code | 0 |
| Cross-Lingual Learning vs. Low-Resource Fine-Tuning: A Case Study with Fact-Checking in Turkish | Code | 0 |
| Analysing Cross-Lingual Transfer in Low-Resourced African Named Entity Recognition | Code | 0 |
| Cross-lingual Intermediate Fine-tuning improves Dialogue State Tracking | Code | 0 |
Page 26 of 79

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | PaLM 2 (few-shot) | Accuracy | 94.4 | — | Unverified |
| 2 | mT0-13B | Accuracy | 84.45 | — | Unverified |
| 3 | RoBERTa Large (translate test) | Accuracy | 76.05 | — | Unverified |
| 4 | BLOOMZ | Accuracy | 75.5 | — | Unverified |
| 5 | MAD-X Base | Accuracy | 60.94 | — | Unverified |
| 6 | mGPT | Accuracy | 55.5 | — | Unverified |