Sicilian Translator: A Recipe for Low-Resource NMT
2021-10-05
Eryk Wdowiak
- github.com/ewdowiak/Sicilian_Translator (official, mxnet)
Abstract
With 17,000 pairs of Sicilian-English translated sentences, Arba Sicula developed the first neural machine translator for the Sicilian language. Using small subword vocabularies, we trained small Transformer models with high dropout parameters and achieved BLEU scores in the upper 20s. Then we supplemented our dataset with backtranslation and multilingual translation and pushed our scores into the mid 30s. We also attribute our success to incorporating theoretical information in our dataset: prior to training, we biased the subword vocabulary towards the desinences one finds in a textbook, and we included textbook exercises in our dataset.
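The vocabulary-biasing step described above can be sketched as a pre-splitting pass that separates known desinences (inflectional endings) from word stems before subword-vocabulary learning, so that those endings surface as their own subword units. The desinence list and the `##` continuation marker below are illustrative assumptions, not the paper's actual preprocessing.

```python
# Hypothetical sketch: bias subword segmentation toward known desinences
# by splitting them off before a BPE/unigram model learns its vocabulary.
# The list below is a small illustrative sample of Sicilian verb endings,
# not the actual list used in the paper.

DESINENCES = ["amu", "ati", "anu", "ava", "iri", "ari", "atu", "a", "i", "u"]

def pre_split(word, desinences=DESINENCES, min_stem=2):
    """Split off the longest matching desinence, keeping a minimal stem."""
    for suffix in sorted(desinences, key=len, reverse=True):
        if word.endswith(suffix) and len(word) - len(suffix) >= min_stem:
            # "##" marks a word-internal subword, as in WordPiece-style output
            return [word[: -len(suffix)], "##" + suffix]
    return [word]

if __name__ == "__main__":
    for w in ["parramu", "parrati", "manciari"]:
        print(w, "->", pre_split(w))
```

Feeding the pre-split corpus to the subword learner makes the listed endings frequent standalone tokens, which is one simple way to realize the textbook-driven bias the abstract describes.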
Tasks
- Machine Translation
Benchmark Results
| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| Arba Sicula | Larger | BLEU (En-Scn) | 35 | — | Unverified |
| Arba Sicula | Many-to-Many | BLEU (It-Scn) | 36.5 | — | Unverified |