SOTAVerified

Low Resource NMT

Papers

Showing 11–20 of 34 papers

Title | Status | Hype
Joint Dropout: Improving Generalizability in Low-Resource Neural Machine Translation through Phrase Pair Variables | Code | 0
HFT: High Frequency Tokens for Low-Resource NMT | Code | 0
FeatureBART: Feature Based Sequence-to-Sequence Pre-Training for Low-Resource NMT | | 0
Taking Actions Separately: A Bidirectionally-Adaptive Transfer Learning Method for Low-Resource Neural Machine Translation | | 0
A Systematic Study Reveals Unexpected Interactions in Pre-Trained Neural Machine Translation | | 0
Controlling Formality in Low-Resource NMT with Domain Adaptation and Re-Ranking: SLT-CDT-UoS at IWSLT2022 | | 0
Machine Translation for Livonian: Catering to 20 Speakers | | 0
On the Effectiveness of Quasi Character-Level Models for Machine Translation | | 0
Sicilian Translator: A Recipe for Low-Resource NMT | Code | 0
Page 2 of 4

No leaderboard results yet.