SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that leverages data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
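The common zero-shot recipe is: encode both languages into a shared representation space (e.g., with a multilingual encoder), train a task head on the high-resource language only, then apply it unchanged to the low-resource language. The toy sketch below illustrates this with NumPy; the "shared space" is simulated by giving translation pairs the same base vector plus small language-specific noise (an assumption standing in for a real multilingual encoder such as XLM-R), and the word lists and labels are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumption: a multilingual encoder maps translation pairs to nearby
# points in a shared space. We simulate that with one base vector per
# concept plus small language-specific noise.
concepts = ["good", "great", "happy", "bad", "awful", "sad"]
base = {c: rng.normal(size=16) for c in concepts}

def embed(concept):
    # One language's noisy view of the shared concept vector.
    return base[concept] + 0.05 * rng.normal(size=16)

# Toy sentiment task: 1 = positive, 0 = negative.
labels = {"good": 1, "great": 1, "happy": 1, "bad": 0, "awful": 0, "sad": 0}
en_words = {"good": "good", "great": "great", "happy": "happy",
            "bad": "bad", "awful": "awful", "sad": "sad"}
es_words = {"bueno": "good", "genial": "great", "feliz": "happy",
            "malo": "bad", "horrible": "awful", "triste": "sad"}

X_en = np.array([embed(c) for c in en_words.values()])
y_en = np.array([labels[c] for c in en_words.values()])
X_es = np.array([embed(c) for c in es_words.values()])
y_es = np.array([labels[c] for c in es_words.values()])

# Train a logistic-regression head on English data only.
w, b = np.zeros(16), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X_en @ w + b)))
    g = p - y_en
    w -= 0.5 * (X_en.T @ g) / len(y_en)
    b -= 0.5 * g.mean()

# Zero-shot evaluation on Spanish: no Spanish labels were used.
pred = (1.0 / (1.0 + np.exp(-(X_es @ w + b))) > 0.5).astype(int)
acc = (pred == y_es).mean()
print(f"zero-shot Spanish accuracy: {acc:.2f}")
```

Because the two languages share the representation space, the English-trained head transfers; in practice the quality of that shared space, not the head, is what limits transfer.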

Papers

Showing 491–500 of 782 papers

| Title | Status | Hype |
|---|---|---|
| UniteD-SRL: A Unified Dataset for Span- and Dependency-Based Multilingual and Cross-Lingual Semantic Role Labeling | Code | 0 |
| Japanese Zero Anaphora Resolution Can Benefit from Parallel Texts Through Neural Transfer Learning | | 0 |
| Learning Cross-lingual Representations for Event Coreference Resolution with Multi-view Alignment and Optimal Transport | | 0 |
| On the Benefit of Syntactic Supervision for Cross-lingual Transfer in Semantic Role Labeling | | 0 |
| Zero-Shot Cross-Lingual Transfer is a Hard Baseline to Beat in German Fine-Grained Entity Typing | | 0 |
| Limitations of Knowledge Distillation for Zero-shot Transfer Learning | | 0 |
| MAD-G: Multilingual Adapter Generation for Efficient Cross-Lingual Transfer | | 0 |
| Sequence Mixup for Zero-Shot Cross-Lingual Part-Of-Speech Tagging | | 0 |
| Multilingual Domain Adaptation for NMT: Decoupling Language and Domain Information with Adapters | | 0 |
| Cross-lingual Constituency Parsing with Linguistic Typology Knowledge | | 0 |
Page 50 of 79

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | PaLM 2 (few-shot) | Accuracy | 94.4 | — | Unverified |
| 2 | mT0-13B | Accuracy | 84.45 | — | Unverified |
| 3 | RoBERTa Large (translate test) | Accuracy | 76.05 | — | Unverified |
| 4 | BLOOMZ | Accuracy | 75.5 | — | Unverified |
| 5 | MAD-X Base | Accuracy | 60.94 | — | Unverified |
| 6 | mGPT | Accuracy | 55.5 | — | Unverified |