SOTAVerified

Low Resource NMT

Papers

Showing 1–10 of 34 papers

| Title | Status | Hype |
|---|---|---|
| ConsistTL: Modeling Consistency in Transfer Learning for Low-Resource Neural Machine Translation | Code | 1 |
| JASS: Japanese-specific Sequence to Sequence Pre-training for Neural Machine Translation | Code | 1 |
| Towards Better Chinese-centric Neural Machine Translation for Low-resource Languages | Code | 1 |
| A Systematic Study Reveals Unexpected Interactions in Pre-Trained Neural Machine Translation | — | 0 |
| Bilingual Low-Resource Neural Machine Translation with Round-Tripping: The Case of Persian-Spanish | — | 0 |
| Beyond Vanilla Fine-Tuning: Leveraging Multistage, Multilingual, and Domain-Specific Methods for Low-Resource Machine Translation | — | 0 |
| Data Augmentation for Sign Language Gloss Translation | — | 0 |
| Controlling Formality in Low-Resource NMT with Domain Adaptation and Re-Ranking: SLT-CDT-UoS at IWSLT2022 | — | 0 |
| Direct Neural Machine Translation with Task-level Mixture of Experts models | — | 0 |
| AUGVIC: Exploiting BiText Vicinity for Low-Resource NMT | — | 0 |

No leaderboard results yet.