
Adobe AMPS’s Submission for Very Low Resource Supervised Translation Task at WMT20

2020-11-01 · WMT (EMNLP) 2020

Keshaw Singh


Abstract

In this paper, we describe our systems submitted to the very low resource supervised translation task at WMT20. We participate in both translation directions for the Upper Sorbian-German language pair. Our primary submission is a subword-level Transformer-based neural machine translation model trained on the original training bitext. In our post-submission work, we also conduct several backtranslation experiments using limited monolingual data and report those results as well. In one such experiment, we observe gains of up to 2.6 BLEU points over the primary system by pretraining on a synthetic, backtranslated corpus followed by fine-tuning on the original parallel training data.
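The backtranslation setup described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: `backtranslate` stands in for a trained reverse-direction (German-to-Upper-Sorbian) model, and the sentences are hypothetical examples. The synthetic pairs it produces would be used for pretraining before fine-tuning on the real bitext.

```python
# Hypothetical sketch of creating a backtranslated pretraining corpus.
# A reverse-direction model translates monolingual German text into
# synthetic Upper Sorbian source sentences; each synthetic source is
# paired with the original German target.

def backtranslate(sentence: str) -> str:
    # Placeholder for inference with a trained de->hsb NMT model;
    # a real system would run beam search over a Transformer.
    return "<synthetic-hsb> " + sentence

def build_synthetic_corpus(mono_de):
    # Pair each monolingual German sentence with its backtranslation.
    return [(backtranslate(de), de) for de in mono_de]

mono_de = ["Das ist ein Test.", "Guten Morgen."]
synthetic = build_synthetic_corpus(mono_de)
# Pretrain the hsb->de model on `synthetic`, then fine-tune on the
# original parallel training data.
```

The key design point is that the German side of each synthetic pair is genuine text, so the model learns to produce fluent target-language output even though the source side is machine-generated.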
