SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that leverages the data and models of a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
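The common zero-shot setup behind many of the papers below can be illustrated with a toy sketch: a classifier is trained only on source-language examples and applied unchanged to target-language examples, which works to the extent that both languages share a common embedding space. Everything here (the `embed` function, the centroids, the data) is a hypothetical stand-in for a real multilingual encoder, not any specific paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

def embed(label, n):
    """Hypothetical 'multilingual encoder': same-label sentences from any
    language land near a shared per-label centroid in a 2-D space."""
    centroids = {"positive": np.array([1.0, 1.0]),
                 "negative": np.array([-1.0, -1.0])}
    return centroids[label] + 0.3 * rng.standard_normal((n, 2))

# Source-language (e.g. English) training data.
X_en = np.vstack([embed("positive", 20), embed("negative", 20)])
y_en = np.array([1] * 20 + [0] * 20)

# Nearest-centroid classifier fit on the source language only.
c_pos = X_en[y_en == 1].mean(axis=0)
c_neg = X_en[y_en == 0].mean(axis=0)

def predict(X):
    d_pos = np.linalg.norm(X - c_pos, axis=1)
    d_neg = np.linalg.norm(X - c_neg, axis=1)
    return (d_pos < d_neg).astype(int)

# Target-language test data: never seen during training.
X_tgt = np.vstack([embed("positive", 10), embed("negative", 10)])
y_tgt = np.array([1] * 10 + [0] * 10)

accuracy = (predict(X_tgt) == y_tgt).mean()
print(f"zero-shot target-language accuracy: {accuracy:.2f}")
```

In practice the shared space is imperfect across languages, which is why transfer quality varies and why methods such as embedding alignment (e.g., ALIGN-MLM below) aim to tighten it.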

Papers

Showing 271-280 of 782 papers

Title | Status | Hype
Languages You Know Influence Those You Learn: Impact of Language Characteristics on Multi-Lingual Text-to-Text Transfer | | 0
Cross-lingual Similarity of Multilingual Representations Revisited | Code | 0
Domain Mismatch Doesn't Always Prevent Cross-Lingual Transfer Learning | | 0
Frustratingly Easy Label Projection for Cross-lingual Transfer | Code | 1
Word-Level Representation From Bytes For Language Modeling | | 0
Unified Question Answering in Slovene | Code | 0
ALIGN-MLM: Word Embedding Alignment is Crucial for Multilingual Pre-training | Code | 1
SexWEs: Domain-Aware Word Embeddings via Cross-lingual Semantic Specialisation for Chinese Sexism Detection in Social Media | Code | 0
Speaking Multiple Languages Affects the Moral Bias of Language Models | Code | 0
GreenPLM: Cross-Lingual Transfer of Monolingual Pre-Trained Language Models at Almost No Cost | Code | 1

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | PaLM 2 (few-shot) | Accuracy | 94.4 | | Unverified
2 | mT0-13B | Accuracy | 84.45 | | Unverified
3 | RoBERTa Large (translate test) | Accuracy | 76.05 | | Unverified
4 | BLOOMZ | Accuracy | 75.5 | | Unverified
5 | MAD-X Base | Accuracy | 60.94 | | Unverified
6 | mGPT | Accuracy | 55.5 | | Unverified