SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that uses data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
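The core idea can be illustrated with a toy sketch: sentences from two languages are mapped into a shared embedding space, a classifier is fit on English labels only, and then applied to the other language with no labeled data in it. The tiny word-vector table and the nearest-centroid classifier below are hypothetical stand-ins; in practice the shared space comes from a multilingual encoder such as mBERT or XLM-R.

```python
# Toy zero-shot cross-lingual transfer: hypothetical shared word embeddings
# in which translation pairs (good/bueno, bad/malo, film/cine) lie close together.
SHARED_EMB = {
    "good": (1.0, 0.1), "bueno": (0.9, 0.2),    # aligned positive words
    "bad": (-1.0, 0.0), "malo": (-0.9, -0.1),   # aligned negative words
    "film": (0.0, 1.0), "cine": (0.1, 0.9),     # aligned topic words
}

def embed(sentence):
    """Bag-of-words encoder: average the shared embeddings of known words."""
    vecs = [SHARED_EMB[w] for w in sentence.split() if w in SHARED_EMB]
    return tuple(sum(v[i] for v in vecs) / len(vecs) for i in range(2))

def fit_centroids(examples):
    """Nearest-centroid classifier: one mean vector per label."""
    sums, counts = {}, {}
    for sent, label in examples:
        v = embed(sent)
        s = sums.setdefault(label, [0.0, 0.0])
        s[0] += v[0]
        s[1] += v[1]
        counts[label] = counts.get(label, 0) + 1
    return {lab: (s[0] / counts[lab], s[1] / counts[lab]) for lab, s in sums.items()}

def predict(centroids, sentence):
    """Assign the label whose centroid is nearest in the shared space."""
    v = embed(sentence)
    return min(centroids,
               key=lambda lab: sum((v[i] - centroids[lab][i]) ** 2 for i in range(2)))

# Train on English labels only...
centroids = fit_centroids([("good film", "pos"), ("bad film", "neg")])

# ...then classify Spanish with no Spanish supervision.
print(predict(centroids, "bueno cine"))  # -> pos
print(predict(centroids, "malo cine"))   # -> neg
```

Because the translation pairs share geometry, decision boundaries learned on English carry over; the papers below study when this works, and how translation, adapters, or meta-learning help when it does not.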

Papers

Showing 476–500 of 782 papers

Title | Status | Hype
How to Translate Your Samples and Choose Your Shots? Analyzing Translate-train & Few-shot Cross-lingual Transfer | — | 0
OneAligner: Zero-shot Cross-lingual Transfer with One Rich-Resource Language Pair for Low-Resource Sentence Retrieval | — | 0
Language-Family Adapters for Multilingual Neural Machine Translation | — | 0
Data-adaptive Transfer Learning for Low-resource Translation: A Case Study in Haitian | — | 0
Make the Best of Cross-lingual Transfer: Evidence from POS Tagging with over 100 Languages | — | 0
Zero-Shot Dependency Parsing with Worst-Case Aware Automated Curriculum Learning | — | 0
INDICXNLI: A Dataset for Studying NLI in Indic Languages | — | 0
Persian Natural Language Inference: A Meta-learning approach | — | 0
Deciphering Speech: a Zero-Resource Approach to Cross-Lingual Transfer in ASR | — | 0
Training Cross-Lingual embeddings for Setswana and Sepedi | Code | 0
Cross-lingual Adaption Model-Agnostic Meta-Learning for Natural Language Understanding | — | 0
Cross-lingual Transfer for Speech Processing using Acoustic Language Similarity | Code | 0
Chinese Opinion Role Labeling with Corpus Translation: A Pivot Study | Code | 0
“Wikily” Supervised Neural Translation Tailored to Cross-Lingual Tasks | — | 0
Frustratingly Simple but Surprisingly Strong: Using Language-Independent Features for Zero-shot Cross-lingual Semantic Parsing | Code | 0
UniteD-SRL: A Unified Dataset for Span- and Dependency-Based Multilingual and Cross-Lingual Semantic Role Labeling | Code | 0
Japanese Zero Anaphora Resolution Can Benefit from Parallel Texts Through Neural Transfer Learning | — | 0
Learning Cross-lingual Representations for Event Coreference Resolution with Multi-view Alignment and Optimal Transport | — | 0
On the Benefit of Syntactic Supervision for Cross-lingual Transfer in Semantic Role Labeling | — | 0
Zero-Shot Cross-Lingual Transfer is a Hard Baseline to Beat in German Fine-Grained Entity Typing | — | 0
Limitations of Knowledge Distillation for Zero-shot Transfer Learning | — | 0
MAD-G: Multilingual Adapter Generation for Efficient Cross-Lingual Transfer | — | 0
Sequence Mixup for Zero-Shot Cross-Lingual Part-Of-Speech Tagging | — | 0
Multilingual Domain Adaptation for NMT: Decoupling Language and Domain Information with Adapters | — | 0
Cross-lingual Constituency Parsing with Linguistic Typology Knowledge | — | 0
Page 20 of 32

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | PaLM 2 (few-shot) | Accuracy | 94.4 | — | Unverified
2 | mT0-13B | Accuracy | 84.45 | — | Unverified
3 | RoBERTa Large (translate test) | Accuracy | 76.05 | — | Unverified
4 | BLOOMZ | Accuracy | 75.5 | — | Unverified
5 | MAD-X Base | Accuracy | 60.94 | — | Unverified
6 | mGPT | Accuracy | 55.5 | — | Unverified