SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that uses data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
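The idea can be illustrated with a minimal, self-contained sketch (toy data, not any specific paper's method): a classifier is trained only on English examples, then evaluated zero-shot on another language by relying on a shared multilingual embedding space in which translation pairs have similar vectors. All words, vectors, and the noise level here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical shared embedding space: each concept has one vector, and
# words for that concept in any language land near it (language-agnostic
# representations, as in multilingual encoders like mBERT or XLM-R).
concepts = {"good": 1, "great": 1, "bad": 0, "awful": 0}
concept_vecs = {c: rng.normal(size=8) for c in concepts}

def embed(concept):
    # A word's vector = its concept vector plus small language-specific noise.
    return concept_vecs[concept] + 0.05 * rng.normal(size=8)

# Training data: English only.
en_words = [("good", "good"), ("great", "great"),
            ("bad", "bad"), ("awful", "awful")]
X_train = np.array([embed(c) for _, c in en_words])
y_train = np.array([concepts[c] for _, c in en_words])

# Simple logistic-regression classifier trained by gradient descent.
w, b = np.zeros(8), 0.0
for _ in range(500):
    p = 1 / (1 + np.exp(-(X_train @ w + b)))
    grad = p - y_train
    w -= 0.5 * X_train.T @ grad / len(y_train)
    b -= 0.5 * grad.mean()

# Zero-shot evaluation: Spanish words never seen in training. Because they
# share the concept vectors, the English-trained classifier transfers.
es_words = [("bueno", "good"), ("malo", "bad"), ("horrible", "awful")]
X_test = np.array([embed(c) for _, c in es_words])
y_test = np.array([concepts[c] for _, c in es_words])
pred = (1 / (1 + np.exp(-(X_test @ w + b))) > 0.5).astype(int)
accuracy = (pred == y_test).mean()
```

Because the target-language vectors sit close to their English counterparts, the decision boundary learned on English alone classifies the unseen Spanish words correctly; this is the shared-representation assumption behind zero-shot cross-lingual transfer.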

Papers

Showing 551–575 of 782 papers

Title | Status | Hype
The interplay between language similarity and script on a novel multi-layer Algerian dialect corpus | Code | 0
Multilingual Domain Adaptation for NMT: Decoupling Language and Domain Information with Adapters | - | 0
Analysing The Impact Of Linguistic Features On Cross-Lingual Transfer | Code | 0
Assessing the Syntactic Capabilities of Transformer-based Multilingual Language Models | - | 0
Multilingual and Zero-Shot is Closing in on Monolingual Web Register Classification | - | 0
Zero-shot Cross-lingual Transfer of Neural Machine Translation with Multilingual Pretrained Encoders | Code | 0
AmericasNLI: Evaluating Zero-shot Natural Language Understanding of Pretrained Multilingual Models in Truly Low-resource Languages | Code | 0
Crossing the Conversational Chasm: A Primer on Natural Language Processing for Multilingual Task-Oriented Dialogue Systems | - | 0
Improving Zero-Shot Cross-Lingual Transfer Learning via Robust Training | Code | 0
"Wikily" Supervised Neural Translation Tailored to Cross-Lingual Tasks | Code | 0
Span Pointer Networks for Non-Autoregressive Task-Oriented Semantic Parsing | - | 0
Fine-tuning Encoders for Improved Monolingual and Zero-shot Polylingual Neural Topic Modeling | Code | 0
MCL@IITK at SemEval-2021 Task 2: Multilingual and Cross-lingual Word-in-Context Disambiguation using Augmented Data, Signals, and Transformers | - | 0
Negation Scope Resolution for Chinese as a Second Language | - | 0
To Block or not to Block: Experiments with Machine Learning for News Comment Moderation | - | 0
Cross-Lingual Transfer Learning for Hate Speech Detection | - | 0
Cross-Lingual Transfer with MAML on Trees | - | 0
Project-then-Transfer: Effective Two-stage Cross-lingual Transfer for Semantic Dependency Parsing | - | 0
Improving Cross-Lingual Transfer for Event Argument Extraction with Language-Universal Sentence Structures | - | 0
GCDH@LT-EDI-EACL2021: XLM-RoBERTa for Hope Speech Detection in English, Malayalam, and Tamil | - | 0
Task-Specific Pre-Training and Cross Lingual Transfer for Sentiment Analysis in Dravidian Code-Switched Languages | - | 0
Meta-Learning with MAML on Trees | - | 0
Zero-Shot Cross-Lingual Dependency Parsing through Contextual Embedding Transformation | Code | 0
Task-Specific Pre-Training and Cross Lingual Transfer for Code-Switched Data | - | 0
Bilingual Language Modeling, A transfer learning technique for Roman Urdu | - | 0
Page 23 of 32

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | PaLM 2 (few-shot) | Accuracy | 94.4 | - | Unverified
2 | mT0-13B | Accuracy | 84.45 | - | Unverified
3 | RoBERTa Large (translate test) | Accuracy | 76.05 | - | Unverified
4 | BLOOMZ | Accuracy | 75.5 | - | Unverified
5 | MAD-X Base | Accuracy | 60.94 | - | Unverified
6 | mGPT | Accuracy | 55.5 | - | Unverified