SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that uses the data and models available for a resource-rich language (e.g., English) to solve tasks in another, typically lower-resource, language.
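As a toy illustration of the idea (all embeddings, sentences, and labels below are invented for this sketch, not taken from any paper listed here): if words from two languages live in a shared embedding space, a classifier trained only on English examples can label text in another language zero-shot.

```python
# Hand-made "multilingual" embeddings: translation pairs get nearby vectors.
# This is a hypothetical 2-D space; real systems learn it from data.
EMB = {
    "good": (1.0, 0.0), "gut": (0.9, 0.1),
    "bad": (-1.0, 0.0), "schlecht": (-0.9, -0.1),
    "movie": (0.0, 1.0), "film": (0.0, 0.9),
}

def embed(sentence):
    """Average the embeddings of known words in a whitespace-tokenized sentence."""
    vecs = [EMB[w] for w in sentence.lower().split() if w in EMB]
    n = len(vecs)
    return tuple(sum(v[i] for v in vecs) / n for i in range(2))

def train_centroids(examples):
    """Compute a per-label mean embedding from (sentence, label) pairs."""
    sums, counts = {}, {}
    for sent, label in examples:
        v = embed(sent)
        s = sums.setdefault(label, [0.0, 0.0])
        s[0] += v[0]
        s[1] += v[1]
        counts[label] = counts.get(label, 0) + 1
    return {lab: (s[0] / counts[lab], s[1] / counts[lab]) for lab, s in sums.items()}

def predict(centroids, sentence):
    """Assign the label whose centroid is closest in the shared space."""
    v = embed(sentence)
    return min(
        centroids,
        key=lambda lab: (centroids[lab][0] - v[0]) ** 2 + (centroids[lab][1] - v[1]) ** 2,
    )

# Train only on English sentiment examples...
centroids = train_centroids([("good movie", "pos"), ("bad movie", "neg")])

# ...then classify German zero-shot via the shared embedding space.
print(predict(centroids, "gut film"))       # pos
print(predict(centroids, "schlecht film"))  # neg
```

The same transfer recipe underlies many of the papers below, with the shared space coming from a multilingual pretrained model or aligned bilingual word embeddings rather than a hand-built dictionary.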

Papers

Showing 326–350 of 782 papers

Title | Status | Hype
One For All & All For One: Bypassing Hyperparameter Tuning with Model Averaging For Cross-Lingual Transfer | Code | 0
ToPro: Token-Level Prompt Decomposition for Cross-Lingual Sequence Labeling Tasks | Code | 0
Oolong: Investigating What Makes Transfer Learning Hard with Controlled Studies | Code | 0
Overlap-based Vocabulary Generation Improves Cross-lingual Transfer Among Related Languages | Code | 0
How to (Properly) Evaluate Cross-Lingual Word Embeddings: On Strong Baselines, Comparative Analyses, and Some Misconceptions | Code | 0
Boosting Cross-Lingual Transfer via Self-Learning with Uncertainty Estimation | Code | 0
Cross-Lingual Approaches to Reference Resolution in Dialogue Systems | | 0
Efficiently Adapting Pretrained Language Models To New Languages | | 0
A Systematic Analysis of Subwords and Cross-Lingual Transfer in Multilingual Translation | | 0
Efficient Cross-Lingual Transfer for Chinese Stable Diffusion with Images as Pivots | | 0
Effects of Language Relatedness for Cross-lingual Transfer Learning in Character-Based Language Models | | 0
Cross-lingual and Supervised Models for Morphosyntactic Annotation: a Comparison on Romanian | | 0
A Survey of Multilingual Models for Automatic Speech Recognition | | 0
A Multi-dimensional Evaluation of Tokenizer-free Multilingual Pretrained Models | | 0
Earth Mover's Distance Minimization for Unsupervised Bilingual Lexicon Induction | | 0
Dynamic Gazetteer Integration in Multilingual Models for Cross-Lingual and Cross-Domain Named Entity Recognition | | 0
Cross-lingual alignment transfer: a chicken-and-egg story? | | 0
Dual-view Curricular Optimal Transport for Cross-lingual Cross-modal Retrieval | | 0
Don't Parse, Insert: Multilingual Semantic Parsing with Insertion Based Decoding | | 0
Cross-lingual Alignment Methods for Multilingual BERT: A Comparative Study | | 0
A Survey Of Cross-lingual Word Embedding Models | | 0
Don't Forget Cheap Training Signals Before Building Unsupervised Bilingual Word Embeddings | | 0
Cross-lingual Adaption Model-Agnostic Meta-Learning for Natural Language Understanding | | 0
Page 14 of 32

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | PaLM 2 (few-shot) | Accuracy | 94.4 | | Unverified
2 | mT0-13B | Accuracy | 84.45 | | Unverified
3 | RoBERTa Large (translate test) | Accuracy | 76.05 | | Unverified
4 | BLOOMZ | Accuracy | 75.5 | | Unverified
5 | MAD-X Base | Accuracy | 60.94 | | Unverified
6 | mGPT | Accuracy | 55.5 | | Unverified