Low-Resource Translation as Language Modeling

2020-11-01 · WMT (EMNLP) 2020

Tucker Berckmann, Berkan Hiziroglu

Abstract

We present our submission to the very low resource supervised machine translation task at WMT20. We use a decoder-only transformer architecture and formulate the translation task as language modeling. To address the low-resource aspect of the problem, we pretrain on a similar-language parallel corpus. Then, we employ an intermediate back-translation step before fine-tuning. Finally, we provide an analysis of the system’s performance.
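
The core formulation, casting a parallel sentence pair as a single sequence scored by a decoder-only language model, can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' code: the GPT-2 backbone, the " => " separator, and the placeholder sentence pair are all assumptions, and the abstract does not say whether the loss is masked on the source side.

```python
# Minimal sketch of "translation as language modeling" with a decoder-only
# transformer. Assumptions (not from the paper): GPT-2 as the backbone,
# " => " as the source/target delimiter, and LM loss over the full sequence.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

SEP = " => "  # assumed delimiter between source and target sides

def lm_loss(src: str, tgt: str) -> torch.Tensor:
    """Concatenate source and target into one causal-LM sequence and score it."""
    text = src + SEP + tgt + tokenizer.eos_token
    ids = tokenizer(text, return_tensors="pt").input_ids
    # Standard next-token prediction over the whole sequence; pretraining on
    # a similar-language corpus, training on back-translated data, and
    # fine-tuning would all reuse this same objective on different pairs.
    return model(input_ids=ids, labels=ids).loss

loss = lm_loss("source sentence", "target sentence")  # placeholder pair
loss.backward()  # gradient for one fine-tuning step
```

Under this framing, inference reduces to prompting the model with the source sentence followed by the separator and decoding until the end-of-sequence token.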
