SOTAVerified

Low Resource NMT

Papers

Showing 21-30 of 34 papers

Title | Status | Hype
On the Effectiveness of Quasi Character-Level Models for Machine Translation | | 0
Quantity vs. Quality of Monolingual Source Data in Automatic Text Translation: Can It Be Too Little If It Is Too Good? | | 0
Taking Actions Separately: A Bidirectionally-Adaptive Transfer Learning Method for Low-Resource Neural Machine Translation | | 0
A Survey on Low-Resource Neural Machine Translation | | 0
AUGVIC: Exploiting BiText Vicinity for Low-Resource NMT | | 0
A Systematic Study Reveals Unexpected Interactions in Pre-Trained Neural Machine Translation | | 0
Beyond Vanilla Fine-Tuning: Leveraging Multistage, Multilingual, and Domain-Specific Methods for Low-Resource Machine Translation | | 0
Bilingual Low-Resource Neural Machine Translation with Round-Tripping: The Case of Persian-Spanish | | 0
Target Conditioned Sampling: Optimizing Data Selection for Multilingual Neural Machine Translation | | 0
Controlling Formality in Low-Resource NMT with Domain Adaptation and Re-Ranking: SLT-CDT-UoS at IWSLT2022 | | 0
Page 3 of 4

No leaderboard results yet.