SOTAVerified

Low Resource NMT

Papers

Showing 11–20 of 34 papers

Title | Status | Hype
Taking Actions Separately: A Bidirectionally-Adaptive Transfer Learning Method for Low-Resource Neural Machine Translation | — | 0
FeatureBART: Feature Based Sequence-to-Sequence Pre-Training for Low-Resource NMT | — | 0
A Systematic Study Reveals Unexpected Interactions in Pre-Trained Neural Machine Translation | — | 0
Controlling Formality in Low-Resource NMT with Domain Adaptation and Re-Ranking: SLT-CDT-UoS at IWSLT2022 | — | 0
Machine Translation for Livonian: Catering to 20 Speakers | — | 0
Towards Better Chinese-centric Neural Machine Translation for Low-resource Languages | Code | 1
On the Effectiveness of Quasi Character-Level Models for Machine Translation | — | 0
Sicilian Translator: A Recipe for Low-Resource NMT | Code | 0
A Survey on Low-Resource Neural Machine Translation | — | 0
Page 2 of 4

No leaderboard results yet.