SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that uses data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
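As a toy illustration of the zero-shot variant of this idea (hypothetical data and a deliberately simplified classifier, not any specific paper's method): if two languages share an aligned embedding space, the way multilingual encoders such as mBERT or XLM-R roughly do, a classifier trained only on English labels can be applied to another language without seeing a single label in it.

```python
# Toy sketch of zero-shot cross-lingual transfer. The hand-crafted
# "multilingual embeddings" below are hypothetical: translation pairs
# occupy nearby points, standing in for a learned shared space.
EMBED = {
    "good": (0.90, 0.10), "gut": (0.85, 0.15),
    "great": (0.95, 0.05), "toll": (0.90, 0.10),
    "bad": (0.10, 0.90), "schlecht": (0.15, 0.85),
    "awful": (0.05, 0.95), "furchtbar": (0.10, 0.90),
}

def embed(sentence):
    """Bag-of-embeddings: average the vectors of known words."""
    vecs = [EMBED[w] for w in sentence.split() if w in EMBED]
    return tuple(sum(v[i] for v in vecs) / len(vecs) for i in range(2))

def train_centroids(labeled):
    """Fit a nearest-centroid classifier from (sentence, label) pairs."""
    by_label = {}
    for sent, label in labeled:
        by_label.setdefault(label, []).append(embed(sent))
    return {lab: tuple(sum(v[i] for v in vs) / len(vs) for i in range(2))
            for lab, vs in by_label.items()}

def predict(centroids, sentence):
    """Assign the label whose centroid is nearest in the shared space."""
    v = embed(sentence)
    return min(centroids,
               key=lambda lab: sum((a - b) ** 2
                                   for a, b in zip(v, centroids[lab])))

# Train on English labels only ...
english_train = [("good great", "pos"), ("bad awful", "neg")]
centroids = train_centroids(english_train)

# ... then classify German zero-shot: no German label was ever seen.
print(predict(centroids, "toll gut"))            # -> pos
print(predict(centroids, "schlecht furchtbar"))  # -> neg
```

The transfer works only to the degree that the two languages' representations are actually aligned; much of the work listed below studies when and why that alignment holds.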

Papers

Showing 501–525 of 782 papers

Title | Status | Hype
Zero-resource Dependency Parsing: Boosting Delexicalized Cross-lingual Transfer with Linguistic Knowledge | | 0
Zero-Resource Multilingual Model Transfer: Learning What to Share | | 0
Zero-shot Cross-Language Transfer of Monolingual Entity Linking Models | | 0
Zero-Shot Cross-lingual Classification Using Multilingual Neural Machine Translation | | 0
Zero-Shot Cross-Lingual Sentiment Classification under Distribution Shift: an Exploratory Study | | 0
Zero-shot Cross-Lingual Transfer for Synthetic Data Generation in Grammatical Error Detection | | 0
Zero-shot cross-lingual transfer in instruction tuning of large language models | | 0
Zero-Shot Cross-Lingual Transfer is a Hard Baseline to Beat in German Fine-Grained Entity Typing | | 0
Zero-shot Cross-lingual Transfer is Under-specified Optimization | | 0
Zero-shot cross-lingual transfer language selection using linguistic similarity | | 0
Zero-shot Cross-lingual Transfer Learning with Multiple Source and Target Languages for Information Extraction: Language Selection and Adversarial Training | | 0
Zero-shot Cross-lingual Transfer without Parallel Corpus | | 0
Zero-Shot Dependency Parsing with Worst-Case Aware Automated Curriculum Learning | | 0
Zero-shot Reading Comprehension by Cross-lingual Transfer Learning with Multi-lingual Language Representation Model | | 0
Fine-Tuning BERT with Character-Level Noise for Zero-Shot Transfer to Dialects and Closely-Related Languages | | 0
Florenz: Scaling Laws for Systematic Generalization in Vision-Language Models | | 0
Fortification of Neural Morphological Segmentation Models for Polysynthetic Minimal-Resource Languages | | 0
FrameNet on the Way to Babel: Creating a Bilingual FrameNet Using Wiktionary as Interlingual Connection | | 0
FreeTransfer-X: Safe and Label-Free Cross-Lingual Transfer from Off-the-Shelf Models | | 0
From Monolingual to Multilingual FAQ Assistant using Multilingual Co-training | | 0
From Zero to Hero: On the Limitations of Zero-Shot Cross-Lingual Transfer with Multilingual Transformers | | 0
Frustratingly Easy Cross-Lingual Transfer for Transition-Based Dependency Parsing | | 0
Frustratingly Simple Regularization to Improve Zero-shot Cross-lingual Robustness | | 0
GCDH@LT-EDI-EACL2021: XLM-RoBERTa for Hope Speech Detection in English, Malayalam, and Tamil | | 0
Page 21 of 32

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | PaLM 2 (few-shot) | Accuracy | 94.4 | | Unverified
2 | mT0-13B | Accuracy | 84.45 | | Unverified
3 | RoBERTa Large (translate test) | Accuracy | 76.05 | | Unverified
4 | BLOOMZ | Accuracy | 75.5 | | Unverified
5 | MAD-X Base | Accuracy | 60.94 | | Unverified
6 | mGPT | Accuracy | 55.5 | | Unverified