SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that uses data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
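The idea can be illustrated with a minimal toy sketch (hypothetical data, not a real model): if a shared multilingual encoder maps translation-equivalent words to nearby vectors, a classifier trained only on English labels can score inputs in another language it never saw during training — the zero-shot cross-lingual setting many of the papers below study.

```python
# Toy zero-shot cross-lingual transfer: a hand-built "shared embedding" stands
# in for a multilingual encoder. Aligned English/Spanish word pairs get
# identical vectors, so an English-only classifier transfers to Spanish.
EMBED = {
    "good": (1.0, 0.0),  "bueno": (1.0, 0.0),
    "great": (0.9, 0.1), "genial": (0.9, 0.1),
    "bad": (0.0, 1.0),   "malo": (0.0, 1.0),
    "awful": (0.1, 0.9), "horrible": (0.1, 0.9),
}

def encode(sentence):
    """Mean-pool the word vectors; words missing from EMBED are skipped."""
    vecs = [EMBED[w] for w in sentence.split() if w in EMBED]
    return tuple(sum(v[i] for v in vecs) / len(vecs) for i in range(2))

# "Fine-tune" on English only: a nearest-centroid sentiment classifier.
train = [("good great", "pos"), ("bad awful", "neg")]
centroids = {label: encode(text) for text, label in train}

def predict(sentence):
    vec = encode(sentence)
    return min(centroids,
               key=lambda lb: sum((a - b) ** 2
                                  for a, b in zip(centroids[lb], vec)))

# Zero-shot evaluation on Spanish, which appeared nowhere in training.
print(predict("genial bueno"))    # → pos
print(predict("malo horrible"))   # → neg
```

Real systems replace the lookup table with a pretrained multilingual encoder (e.g., mBERT or XLM-R), but the transfer mechanism — a shared representation space bridging the training and evaluation languages — is the same.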

Papers

Showing 291–300 of 782 papers

Title | Status | Hype
On the Calibration of Massively Multilingual Language Models | Code | 1
A Simple and Effective Method to Improve Zero-Shot Cross-Lingual Transfer Learning | Code | 0
Improving Low-Resource Cross-lingual Parsing with Expected Statistic Regularization | Code | 0
A Multi-dimensional Evaluation of Tokenizer-free Multilingual Pretrained Models | — | 0
You Can Have Your Data and Balance It Too: Towards Balanced and Efficient Multilingual Models | — | 0
Language Agnostic Multilingual Information Retrieval with Contrastive Learning | Code | 0
The (In)Effectiveness of Intermediate Task Training For Domain Adaptation and Cross-Lingual Transfer Learning | — | 0
SLICER: Sliced Fine-Tuning for Low-Resource Cross-Lingual Transfer for Named Entity Recognition | Code | 0
Analyzing BERT Cross-lingual Transfer Capabilities in Continual Sequence Labeling | Code | 0
Does Meta-learning Help mBERT for Few-shot Question Generation in a Cross-lingual Transfer Setting for Indic Languages? | — | 0
Page 30 of 79

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | PaLM 2 (few-shot) | Accuracy | 94.4 | — | Unverified
2 | mT0-13B | Accuracy | 84.45 | — | Unverified
3 | RoBERTa Large (translate test) | Accuracy | 76.05 | — | Unverified
4 | BLOOMZ | Accuracy | 75.5 | — | Unverified
5 | MAD-X Base | Accuracy | 60.94 | — | Unverified
6 | mGPT | Accuracy | 55.5 | — | Unverified