SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that uses the data and models available for a high-resource language (e.g., English) to solve tasks in another, typically lower-resource, language.
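
A common instance of this setup is zero-shot transfer: train a classifier on high-resource-language examples in a shared multilingual representation space, then apply it unchanged to another language. The sketch below illustrates this with a nearest-centroid classifier over tiny hand-made 2-D "multilingual embeddings"; the vocabulary and vectors are invented for illustration and do not come from any real model.

```python
# Toy zero-shot cross-lingual transfer: train on English, evaluate on French.
# The "multilingual embeddings" are hypothetical 2-D vectors in which
# translation pairs sit near each other, mimicking a shared embedding space.
EMB = {
    # English                          # French
    "good": (0.9, 0.1),   "bon": (0.85, 0.15),
    "great": (0.8, 0.2),  "super": (0.80, 0.25),
    "bad": (0.1, 0.9),    "mauvais": (0.12, 0.88),
    "awful": (0.2, 0.8),  "horrible": (0.15, 0.85),
}

def embed(sentence):
    """Average the vectors of the known words in a sentence."""
    vecs = [EMB[w] for w in sentence.split() if w in EMB]
    return tuple(sum(coord) / len(vecs) for coord in zip(*vecs))

def train_centroids(examples):
    """Compute one centroid vector per label from (sentence, label) pairs."""
    by_label = {}
    for sent, label in examples:
        by_label.setdefault(label, []).append(embed(sent))
    return {lab: tuple(sum(c) / len(vs) for c in zip(*vs))
            for lab, vs in by_label.items()}

def predict(centroids, sentence):
    """Assign the label whose centroid is closest to the sentence embedding."""
    v = embed(sentence)
    sqdist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda lab: sqdist(v, centroids[lab]))

# Train on English data only...
english_train = [("good great", "pos"), ("bad awful", "neg")]
centroids = train_centroids(english_train)

# ...then evaluate zero-shot on French inputs.
print(predict(centroids, "bon super"))         # -> pos
print(predict(centroids, "mauvais horrible"))  # -> neg
```

In practice the shared space comes from a pretrained multilingual encoder (e.g., mBERT or XLM-R, both of which appear throughout this literature) rather than hand-made vectors, but the transfer mechanism is the same: the classifier never sees target-language labels.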

Papers

Showing 751–760 of 782 papers

Title | Status | Hype
Negation Scope Resolution for Chinese as a Second Language | | 0
Neural Cross-Lingual Event Detection with Minimal Parallel Resources | | 0
Neural Task Representations as Weak Supervision for Model Agnostic Cross-Lingual Transfer | | 0
Neuron Specialization: Leveraging intrinsic task modularity for multilingual machine translation | | 0
OneAligner: Zero-shot Cross-lingual Transfer with One Rich-Resource Language Pair for Low-Resource Sentence Retrieval | | 0
One-Shot Neural Cross-Lingual Transfer for Paradigm Completion | | 0
One Step Is Enough for Few-Shot Cross-Lingual Transfer: Co-Training with Gradient Optimization | | 0
On the ability of monolingual models to learn language-agnostic representations | | 0
On the Benefit of Syntactic Supervision for Cross-lingual Transfer in Semantic Role Labeling | | 0

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | PaLM 2 (few-shot) | Accuracy | 94.4 | | Unverified
2 | mT0-13B | Accuracy | 84.45 | | Unverified
3 | RoBERTa Large (translate test) | Accuracy | 76.05 | | Unverified
4 | BLOOMZ | Accuracy | 75.5 | | Unverified
5 | MAD-X Base | Accuracy | 60.94 | | Unverified
6 | mGPT | Accuracy | 55.5 | | Unverified