NoahNMT at WMT 2021: Dual Transfer for Very Low Resource Supervised Machine Translation

2021-11-01 · WMT (EMNLP) 2021

Meng Zhang, Minghao Wu, Pengfei Li, Liangyou Li, Qun Liu

Abstract

This paper describes the NoahNMT system submitted to the WMT 2021 shared task of Very Low Resource Supervised Machine Translation. The system is a standard Transformer model equipped with our recent technique of dual transfer. It also employs widely used techniques that are known to be helpful for neural machine translation, including iterative back-translation, selected finetuning, and ensemble. The final submission achieves the top BLEU for three translation directions.
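Among the techniques the abstract lists, iterative back-translation is the most mechanical to illustrate: forward and backward models are alternately retrained on a mix of genuine parallel data and synthetic pairs produced by translating monolingual text with the other direction's model. The sketch below is not the paper's implementation; it stands in a toy dictionary-based "translator" for a real Transformer, and all function names are illustrative.

```python
# Hedged sketch of iterative back-translation. A word-for-word dictionary
# translator stands in for training a real Transformer; names are illustrative.

def train(parallel_pairs):
    """Toy 'training': build a word-substitution table from parallel data."""
    table = {}
    for src, tgt in parallel_pairs:
        # Naive 1:1 word alignment, only plausible for this toy setting.
        for s, t in zip(src.split(), tgt.split()):
            table.setdefault(s, t)
    return lambda sent: " ".join(table.get(w, w) for w in sent.split())

def iterative_back_translation(parallel, mono_src, mono_tgt, rounds=2):
    """Alternately refresh forward (src->tgt) and backward (tgt->src)
    models using synthetic pairs translated from monolingual data."""
    fwd = train(parallel)
    bwd = train([(t, s) for s, t in parallel])
    for _ in range(rounds):
        # Back-translate target-side monolingual text into synthetic sources,
        # then retrain the forward model on real + synthetic pairs.
        synth_for_fwd = [(bwd(t), t) for t in mono_tgt]
        fwd = train(parallel + synth_for_fwd)
        # Symmetrically, forward-translate source-side monolingual text
        # and retrain the backward model.
        synth_for_bwd = [(fwd(s), s) for s in mono_src]
        bwd = train([(t, s) for s, t in parallel] + synth_for_bwd)
    return fwd, bwd
```

In a real low-resource system the monolingual corpora are far larger than the parallel data, which is what makes the synthetic pairs worth the noise they introduce.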