SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that uses data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
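The core idea can be sketched in a toy example: if words from two languages live in one shared embedding space, a classifier trained only on high-resource-language labels can be applied zero-shot to the other language. Everything below (the vectors, the word pairs, the nearest-centroid "classifier") is an invented illustration, not real multilingual embeddings or any specific system from the papers listed here.

```python
# Toy sketch of zero-shot cross-lingual transfer.
# Assumption: a shared embedding space in which translation pairs
# (e.g. "good" ~ "bueno") sit near each other. All vectors are made up.
import math

EMB = {
    "good":  (0.90, 0.10), "bueno":    (0.88, 0.12),
    "great": (0.95, 0.05), "genial":   (0.93, 0.07),
    "bad":   (0.10, 0.90), "malo":     (0.12, 0.88),
    "awful": (0.05, 0.95), "terrible": (0.07, 0.93),
}

def embed(sentence):
    """Represent a sentence as the mean of its known word vectors."""
    vecs = [EMB[w] for w in sentence.split() if w in EMB]
    return tuple(sum(v[i] for v in vecs) / len(vecs) for i in range(2))

def train_centroids(labeled):
    """Nearest-centroid classifier: one mean vector per label."""
    sums, counts = {}, {}
    for text, label in labeled:
        v = embed(text)
        s = sums.setdefault(label, [0.0, 0.0])
        s[0] += v[0]
        s[1] += v[1]
        counts[label] = counts.get(label, 0) + 1
    return {lab: (s[0] / counts[lab], s[1] / counts[lab])
            for lab, s in sums.items()}

def predict(centroids, text):
    """Assign the label whose centroid is closest in the shared space."""
    v = embed(text)
    return min(centroids, key=lambda lab: math.dist(v, centroids[lab]))

# Train on English labels only ...
clf = train_centroids([("good great", "pos"), ("bad awful", "neg")])

# ... then evaluate zero-shot on Spanish input.
print(predict(clf, "bueno genial"))   # pos
print(predict(clf, "malo terrible"))  # neg
```

Real systems replace the toy vectors with a pretrained multilingual encoder (e.g. multilingual BERT, as studied in several papers below), but the transfer mechanism is the same: the shared representation space lets supervision in one language carry over to another.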

Papers

Showing 481–490 of 782 papers

| Title | Status | Hype |
| --- | --- | --- |
| A Study of Cross-Lingual Ability and Language-specific Information in Multilingual BERT | | 0 |
| A Survey Of Cross-lingual Word Embedding Models | | 0 |
| A Survey of Multilingual Models for Automatic Speech Recognition | | 0 |
| A Systematic Analysis of Subwords and Cross-Lingual Transfer in Multilingual Translation | | 0 |
| A Three-Pronged Approach to Cross-Lingual Adaptation with Multilingual LLMs | | 0 |
| Automatic Interlinear Glossing for Under-Resourced Languages Leveraging Translations | | 0 |
| Auxiliary Subword Segmentations as Related Languages for Low Resource Multilingual Translation | | 0 |
| A Zero-shot Learning Method Based on Large Language Models for Multi-modal Knowledge Graph Embedding | | 0 |
| BAD-X: Bilingual Adapters Improve Zero-Shot Cross-Lingual Transfer | | 0 |
| Bailong: Bilingual Transfer Learning based on QLoRA and Zip-tie Embedding | | 0 |
Page 49 of 79

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | PaLM 2 (few-shot) | Accuracy | 94.4 | | Unverified |
| 2 | mT0-13B | Accuracy | 84.45 | | Unverified |
| 3 | RoBERTa Large (translate test) | Accuracy | 76.05 | | Unverified |
| 4 | BLOOMZ | Accuracy | 75.5 | | Unverified |
| 5 | MAD-X Base | Accuracy | 60.94 | | Unverified |
| 6 | mGPT | Accuracy | 55.5 | | Unverified |