SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that uses data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
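The common zero-shot recipe behind this idea can be sketched in a few lines: a multilingual encoder maps sentences from all languages into one shared vector space, a classifier is trained on English vectors only, and it is then applied unchanged to the target language. The sketch below is a toy illustration of that recipe, not any specific system from the papers listed here; the "shared space" is simulated with synthetic vectors (in practice it would come from a multilingual encoder such as mBERT or LaBSE), and the gradient-descent classifier is a plain logistic regression.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated "shared multilingual space": sentences of the same class cluster
# together regardless of language (aligning them is the encoder's job).
def embed(label, n, dim=8, noise=0.3):
    center = np.ones(dim) if label == 1 else -np.ones(dim)
    return center + noise * rng.standard_normal((n, dim))

# Train a logistic-regression classifier on English examples only.
X_en = np.vstack([embed(1, 50), embed(0, 50)])
y_en = np.array([1] * 50 + [0] * 50)

w, b = np.zeros(X_en.shape[1]), 0.0
for _ in range(200):  # plain gradient descent on the logistic loss
    p = 1 / (1 + np.exp(-(X_en @ w + b)))
    g = p - y_en
    w -= 0.1 * (X_en.T @ g) / len(y_en)
    b -= 0.1 * g.mean()

# Zero-shot evaluation on the target language: no target-language labels
# were used at any point during training.
X_tgt = np.vstack([embed(1, 20), embed(0, 20)])
y_tgt = np.array([1] * 20 + [0] * 20)
acc = ((X_tgt @ w + b > 0) == (y_tgt == 1)).mean()
print(f"zero-shot target-language accuracy: {acc:.2f}")
```

Because the toy clusters are well separated in the shared space, the English-trained classifier transfers essentially for free; with real encoders, transfer quality depends on how well the multilingual representations are actually aligned.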

Papers

Showing 731–740 of 782 papers

Title | Status | Hype
MSVD-Indonesian: A Benchmark for Multimodal Video-Text Tasks in Indonesian | Code | 0
MT4CrossOIE: Multi-stage Tuning for Cross-lingual Open Information Extraction | Code | 0
Simple and Effective Zero-shot Cross-lingual Phoneme Recognition | Code | 0
Single-/Multi-Source Cross-Lingual NER via Teacher-Student Learning on Unlabeled Data in Target Language | Code | 0
Multi2WOZ: A Robust Multilingual Dataset and Conversational Pretraining for Task-Oriented Dialog | Code | 0
Zero-Resource Cross-Lingual Named Entity Recognition | Code | 0
Cross-Lingual Learning vs. Low-Resource Fine-Tuning: A Case Study with Fact-Checking in Turkish | Code | 0
Cross-lingual Intermediate Fine-tuning improves Dialogue State Tracking | Code | 0
Cross-lingual Emotion Intensity Prediction | Code | 0
SLICER: Sliced Fine-Tuning for Low-Resource Cross-Lingual Transfer for Named Entity Recognition | Code | 0
Page 74 of 79

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | PaLM 2 (few-shot) | Accuracy | 94.4 | — | Unverified
2 | mT0-13B | Accuracy | 84.45 | — | Unverified
3 | RoBERTa Large (translate test) | Accuracy | 76.05 | — | Unverified
4 | BLOOMZ | Accuracy | 75.5 | — | Unverified
5 | MAD-X Base | Accuracy | 60.94 | — | Unverified
6 | mGPT | Accuracy | 55.5 | — | Unverified