SOTAVerified

Cross-Lingual Transfer

Cross-lingual transfer is transfer learning that uses data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
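To make the idea concrete, here is a minimal, self-contained sketch of zero-shot cross-lingual transfer. The key assumption (standing in for a real multilingual encoder such as a pretrained multilingual language model) is a shared embedding space in which words from different languages with similar meanings land near each other; the tiny hand-built lexicon below is a hypothetical mock of that encoder, not any real model's API.

```python
# Toy illustration of zero-shot cross-lingual transfer.
# Assumption: a real system would use a multilingual encoder that maps
# all languages into one shared vector space; here a tiny hand-built
# lexicon stands in for that encoder so the sketch runs stand-alone.

SHARED_SPACE = {
    # English words (used for training)
    "good": (1.0, 0.1), "great": (0.9, 0.2),
    "bad": (-1.0, 0.1), "awful": (-0.9, 0.2),
    # Spanish words (never seen during training)
    "bueno": (0.95, 0.15), "malo": (-0.95, 0.15),
}

def encode(sentence):
    """Mock encoder: average the shared embeddings of known words."""
    vecs = [SHARED_SPACE[w] for w in sentence.split() if w in SHARED_SPACE]
    return tuple(sum(v[i] for v in vecs) / len(vecs) for i in range(2))

def predict(centroids, sentence):
    """Nearest-centroid classification in the shared space."""
    v = encode(sentence)
    return min(
        centroids,
        key=lambda lbl: sum((v[i] - centroids[lbl][i]) ** 2 for i in range(2)),
    )

# "Fine-tune" on English-only labeled data: one centroid per class.
english_train = [("good great", "pos"), ("bad awful", "neg")]
centroids = {label: encode(text) for text, label in english_train}

# Zero-shot transfer: classify Spanish inputs with no Spanish training data.
print(predict(centroids, "bueno"))  # pos
print(predict(centroids, "malo"))   # neg
```

Because the classifier only ever sees points in the shared space, it transfers to any language the encoder covers; this is the same mechanism that lets models fine-tuned on English benchmarks score on other languages in the results below.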

Papers

Showing 81–90 of 782 papers

| Title | Status | Hype |
| --- | --- | --- |
| K-Wav2vec 2.0: Automatic Speech Recognition based on Joint Decoding of Graphemes and Syllables | Code | 1 |
| Learning Compact Metrics for MT | Code | 1 |
| End-to-End Slot Alignment and Recognition for Cross-Lingual NLU | Code | 1 |
| LEIA: Facilitating Cross-lingual Knowledge Transfer in Language Models with Entity-based Data Augmentation | Code | 1 |
| Bridging the Gap: Enhancing LLM Performance for Low-Resource African Languages with New Benchmarks, Fine-Tuning, and Cultural Adjustments | Code | 1 |
| Cross-lingual Aspect-based Sentiment Analysis with Aspect Term Code-Switching | Code | 1 |
| FILTER: An Enhanced Fusion Method for Cross-lingual Language Understanding | Code | 1 |
| It’s All in the Heads: Using Attention Heads as a Baseline for Cross-Lingual Transfer in Commonsense Reasoning | Code | 1 |
| mPLM-Sim: Better Cross-Lingual Similarity and Transfer in Multilingual Pretrained Language Models | Code | 1 |
| The Geometry of Multilingual Language Model Representations | Code | 1 |
Page 9 of 79

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | PaLM 2 (few-shot) | Accuracy | 94.4 | | Unverified |
| 2 | mT0-13B | Accuracy | 84.45 | | Unverified |
| 3 | RoBERTa Large (translate test) | Accuracy | 76.05 | | Unverified |
| 4 | BLOOMZ | Accuracy | 75.5 | | Unverified |
| 5 | MAD-X Base | Accuracy | 60.94 | | Unverified |
| 6 | mGPT | Accuracy | 55.5 | | Unverified |