SOTAVerified

Low Resource NMT

Papers

Showing 1–10 of 34 papers

| Title | Status | Hype |
|---|---|---|
| ConsistTL: Modeling Consistency in Transfer Learning for Low-Resource Neural Machine Translation | Code | 1 |
| Towards Better Chinese-centric Neural Machine Translation for Low-resource Languages | Code | 1 |
| JASS: Japanese-specific Sequence to Sequence Pre-training for Neural Machine Translation | Code | 1 |
| Beyond Vanilla Fine-Tuning: Leveraging Multistage, Multilingual, and Domain-Specific Methods for Low-Resource Machine Translation | | 0 |
| From Priest to Doctor: Domain Adaptation for Low-Resource Neural Machine Translation | Code | 0 |
| Quantity vs. Quality of Monolingual Source Data in Automatic Text Translation: Can It Be Too Little If It Is Too Good? | | 0 |
| High-Quality Data Augmentation for Low-Resource NMT: Combining a Translation Memory, a GAN Generator, and Filtering | | 0 |
| Enhancing Low-Resource NMT with a Multilingual Encoder and Knowledge Distillation: A Case Study | Code | 0 |
| Low-resource neural machine translation with morphological modeling | Code | 0 |
| Direct Neural Machine Translation with Task-level Mixture of Experts models | | 0 |

No leaderboard results yet.