SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that uses data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
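The core idea can be sketched in a toy example: if words from two languages live in one shared embedding space, a classifier trained only on labeled English data can score inputs in another language it never saw during training. The tiny aligned vectors and the nearest-centroid classifier below are invented purely for illustration, not taken from any paper on this page.

```python
# Toy sketch of zero-shot cross-lingual transfer (illustrative only).
# English and Spanish words share one hand-made embedding space, so a
# classifier fit on English labels generalizes to Spanish at test time.

EMB = {
    # English                         # Spanish (aligned to the same axes)
    "good":  (1.0, 0.2),   "bueno":  (0.9, 0.1),
    "bad":   (-1.0, -0.3), "malo":   (-0.9, -0.2),
    "great": (0.8, 0.4),   "genial": (0.7, 0.3),
}

def featurize(sentence):
    """Average the embeddings of known words (bag-of-vectors)."""
    vecs = [EMB[w] for w in sentence.split() if w in EMB]
    if not vecs:
        return (0.0, 0.0)
    return tuple(sum(dim) / len(vecs) for dim in zip(*vecs))

def train_centroids(examples):
    """Fit a nearest-centroid classifier on labeled sentences."""
    sums = {}
    for text, label in examples:
        x = featurize(text)
        s, count = sums.get(label, ((0.0, 0.0), 0))
        sums[label] = (tuple(a + b for a, b in zip(s, x)), count + 1)
    return {lab: tuple(a / c for a in s) for lab, (s, c) in sums.items()}

def predict(centroids, sentence):
    """Assign the label whose centroid is closest in embedding space."""
    x = featurize(sentence)
    return min(
        centroids,
        key=lambda lab: sum((a - b) ** 2 for a, b in zip(centroids[lab], x)),
    )

# Train only on English ...
centroids = train_centroids([("good great", "pos"), ("bad", "neg")])
# ... then evaluate zero-shot on Spanish.
print(predict(centroids, "bueno genial"))  # pos
print(predict(centroids, "malo"))          # neg
```

Real systems replace the hand-made vectors with a multilingual pretrained encoder (e.g., mBERT or XLM-R) and the centroid classifier with a fine-tuned head, but the transfer mechanism — a shared representation space across languages — is the same.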

Papers

Showing 676–700 of 782 papers

Jointly Learning to Align and Summarize for Neural Cross-Lingual Summarization
Jointly Learning to Embed and Predict with Multiple Languages
JW300: A Wide-Coverage Parallel Corpus for Low-Resource Languages
Key ingredients for effective zero-shot cross-lingual knowledge transfer in generative tasks
Language and Task Arithmetic with Parameter-Efficient Layers for Zero-Shot Summarization
Language Chameleon: Transformation analysis between languages using Cross-lingual Post-training based on Pre-trained language models
Language Contamination Helps Explain the Cross-lingual Capabilities of English Pretrained Models
Language-Family Adapters for Multilingual Neural Machine Translation
Language-Family Adapters for Low-Resource Multilingual Neural Machine Translation
Language-independent Cross-lingual Contextual Representations
Language Scaling for Universal Suggested Replies Model
Language-specific Neurons Do Not Facilitate Cross-Lingual Transfer
Languages You Know Influence Those You Learn: Impact of Language Characteristics on Multi-Lingual Text-to-Text Transfer
Layer Swapping for Zero-Shot Cross-Lingual Transfer in Large Language Models
Learning Cross-lingual Representations for Event Coreference Resolution with Multi-view Alignment and Optimal Transport
Learning from a Neighbor: Adapting a Japanese Parser for Korean Through Feature Transfer Learning
Learning Invariant Representations on Multilingual Language Models for Unsupervised Cross-Lingual Transfer
Learning Monolingual Compositional Representations via Bilingual Supervision
Learning to Learn Morphological Inflection for Resource-Poor Languages
Learning Transfers over Several Programming Languages
Learn to Cross-lingual Transfer with Meta Graph Learning Across Heterogeneous Languages
Leveraging Multi-lingual Positive Instances in Contrastive Learning to Improve Sentence Embedding
Leveraging Text Data Using Hybrid Transformer-LSTM Based End-to-End ASR in Transfer Learning
LexFit: Lexical Fine-Tuning of Pretrained Language Models
Limitations of Knowledge Distillation for Zero-shot Transfer Learning
Page 28 of 32

Benchmark Results

#  Model                           Metric    Claimed  Verified  Status
1  PaLM 2 (few-shot)               Accuracy  94.4     —         Unverified
2  mT0-13B                         Accuracy  84.45    —         Unverified
3  RoBERTa Large (translate test)  Accuracy  76.05    —         Unverified
4  BLOOMZ                          Accuracy  75.5     —         Unverified
5  MAD-X Base                      Accuracy  60.94    —         Unverified
6  mGPT                            Accuracy  55.5     —         Unverified