SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer refers to transfer learning that uses data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
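The recipe above can be sketched in miniature. This is a toy illustration, not any specific paper's method: a hypothetical hand-built shared multilingual embedding table stands in for a real multilingual encoder (e.g., mBERT or XLM-R), and a nearest-centroid classifier stands in for fine-tuning. The classifier sees only English labels, then is applied unchanged to Spanish input — the essence of zero-shot cross-lingual transfer via a shared representation space.

```python
# Toy sketch of zero-shot cross-lingual transfer.
# ASSUMPTION: the embedding table below is invented for illustration;
# a real system would use a pretrained multilingual encoder.

SHARED_EMBED = {
    # English and Spanish words mapped into the same (made-up) 2-d space
    "good": (1.0, 0.2), "bueno": (0.9, 0.3),
    "bad": (-1.0, 0.1), "malo": (-0.9, 0.2),
    "great": (1.2, 0.1), "terrible": (-1.1, 0.0),
}

def embed(sentence):
    """Average the embeddings of known words (mean pooling)."""
    vecs = [SHARED_EMBED[w] for w in sentence.split() if w in SHARED_EMBED]
    n = max(len(vecs), 1)
    return (sum(v[0] for v in vecs) / n, sum(v[1] for v in vecs) / n)

def train_centroids(examples):
    """'Fine-tune' on labeled source-language data: one centroid per label."""
    sums = {}
    for sent, label in examples:
        x = embed(sent)
        s = sums.setdefault(label, [0.0, 0.0, 0])
        s[0] += x[0]; s[1] += x[1]; s[2] += 1
    return {lab: (s[0] / s[2], s[1] / s[2]) for lab, s in sums.items()}

def predict(centroids, sentence):
    """Assign the label whose centroid is nearest in the shared space."""
    x = embed(sentence)
    return min(centroids, key=lambda lab: (x[0] - centroids[lab][0]) ** 2
                                          + (x[1] - centroids[lab][1]) ** 2)

# Train on English only...
centroids = train_centroids([("good great", "pos"), ("bad terrible", "neg")])

# ...then evaluate zero-shot on Spanish: no Spanish labels were ever seen.
print(predict(centroids, "muy bueno"))  # -> pos
print(predict(centroids, "muy malo"))   # -> neg
```

The transfer works only because both languages share one embedding space; with separate per-language embeddings, the English-trained centroids would be meaningless for Spanish input, which is why multilingual pretraining is the enabling ingredient.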

Papers

Showing 61–70 of 782 papers

Title | Status | Hype
Cross-lingual Back-Parsing: Utterance Synthesis from Meaning Representation for Zero-Resource Semantic Parsing | Code | 0
Evaluating and explaining training strategies for zero-shot cross-lingual news sentiment analysis | — | 0
EMMA-500: Enhancing Massively Multilingual Adaptation of Large Language Models | Code | 0
Cross-lingual transfer of multilingual models on low resource African Languages | Code | 0
Exploring the Impact of Data Quantity on ASR in Extremely Low-resource Languages | — | 0
SpeechTaxi: On Multilingual Semantic Speech Classification | — | 0
A multilingual training strategy for low resource Text to Speech | — | 0
Exploring Multiple Strategies to Improve Multilingual Coreference Resolution in CorefUD | Code | 0
Defining Boundaries: The Impact of Domain Specification on Cross-Language and Cross-Domain Transfer in Machine Translation | — | 0
RedWhale: An Adapted Korean LLM Through Efficient Continual Pretraining | — | 0
Page 7 of 79

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | PaLM 2 (few-shot) | Accuracy | 94.4 | — | Unverified
2 | mT0-13B | Accuracy | 84.45 | — | Unverified
3 | RoBERTa Large (translate test) | Accuracy | 76.05 | — | Unverified
4 | BLOOMZ | Accuracy | 75.5 | — | Unverified
5 | MAD-X Base | Accuracy | 60.94 | — | Unverified
6 | mGPT | Accuracy | 55.5 | — | Unverified