SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that uses the data and models of a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
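The core assumption behind this setup can be illustrated with a toy sketch (hypothetical data and a hand-rolled perceptron, not any model from the listings below): if two languages share one embedding space, as multilingual encoders such as XLM-RoBERTa aim to provide, then a classifier trained only on English labels transfers zero-shot to the other language.

```python
# Toy zero-shot cross-lingual transfer. The "shared embedding space" is
# faked by hand: translation pairs get nearly identical vectors.
EMBED = {
    "good": (1.0, 0.2), "bueno": (0.95, 0.25),
    "bad": (-1.0, 0.1), "malo": (-0.9, 0.15),
    "great": (0.9, -0.1), "genial": (0.85, -0.05),
    "awful": (-0.8, -0.2), "horrible": (-0.75, -0.15),
}

def embed(sentence):
    """Average the embeddings of known words (a crude sentence encoder)."""
    vecs = [EMBED[w] for w in sentence.split() if w in EMBED]
    return tuple(sum(v[i] for v in vecs) / len(vecs) for i in range(2))

def train(examples, epochs=200, lr=0.1):
    """Perceptron training on (sentence, label) pairs, labels in {-1, +1}."""
    w = [0.0, 0.0]
    for _ in range(epochs):
        for sent, y in examples:
            x = embed(sent)
            if (w[0] * x[0] + w[1] * x[1]) * y <= 0:  # misclassified
                w[0] += lr * y * x[0]
                w[1] += lr * y * x[1]
    return w

def predict(w, sent):
    x = embed(sent)
    return 1 if w[0] * x[0] + w[1] * x[1] > 0 else -1

# Train on English sentiment labels only...
w = train([("good great", 1), ("bad awful", -1)])

# ...then evaluate zero-shot on Spanish.
print(predict(w, "bueno genial"))   # → 1 (positive)
print(predict(w, "malo horrible"))  # → -1 (negative)
```

In practice the shared space comes from multilingual pre-training rather than a lookup table, and transfer quality degrades with typological distance from the fine-tuning language — which is what much of the work listed below investigates.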

Papers

Showing 521–530 of 782 papers

Title | Status | Hype
From Monolingual to Multilingual FAQ Assistant using Multilingual Co-training | | 0
From Zero to Hero: On the Limitations of Zero-Shot Cross-Lingual Transfer with Multilingual Transformers | | 0
Frustratingly Easy Cross-Lingual Transfer for Transition-Based Dependency Parsing | | 0
Frustratingly Simple Regularization to Improve Zero-shot Cross-lingual Robustness | | 0
GCDH@LT-EDI-EACL2021: XLM-RoBERTa for Hope Speech Detection in English, Malayalam, and Tamil | | 0
Generalization Measures for Zero-Shot Cross-Lingual Transfer | | 0
GlossReader at SemEval-2021 Task 2: Reading Definitions Improves Contextualized Word Embeddings | | 0
HanjaBridge: Resolving Semantic Ambiguity in Korean LLMs via Hanja-Augmented Pre-Training | | 0
How do languages influence each other? Studying cross-lingual data sharing during LM fine-tuning | | 0
How Do Multilingual Encoders Learn Cross-lingual Representation? | | 0
Page 53 of 79

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | PaLM 2 (few-shot) | Accuracy | 94.4 | | Unverified
2 | mT0-13B | Accuracy | 84.45 | | Unverified
3 | RoBERTa Large (translate test) | Accuracy | 76.05 | | Unverified
4 | BLOOMZ | Accuracy | 75.5 | | Unverified
5 | MAD-X Base | Accuracy | 60.94 | | Unverified
6 | mGPT | Accuracy | 55.5 | | Unverified