
Iterative Batch Back-Translation for Neural Machine Translation: A Conceptual Model

2019-11-26

Idris Abdulmumin, Bashir Shehu Galadanci, Abubakar Isa


Abstract

An effective method for generating a large number of parallel sentences to train improved neural machine translation (NMT) systems is back-translation of the target-side monolingual data. Recently, iterative back-translation has been shown to outperform standard back-translation, albeit only on some language pairs. This work proposes iterative batch back-translation, an approach aimed at enhancing standard iterative back-translation and enabling the efficient utilization of more monolingual data. After each iteration, improved back-translations of a new batch of sentences are added to the parallel data that will be used to train the final forward model. The work presents a conceptual model of the proposed approach.
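The loop described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the function names and the toy stand-ins for training and decoding are assumptions, chosen only to show how each iteration back-translates a fresh batch of monolingual sentences and grows the parallel data used to train the final forward model.

```python
def train_model(parallel_data):
    """Placeholder for NMT training (assumption): the returned 'model'
    merely records how many sentence pairs it was trained on."""
    return {"train_size": len(parallel_data)}


def back_translate(model, monolingual_batch):
    """Placeholder for decoding (assumption): pair each target-side
    sentence with a synthetic source sentence."""
    return [("synthetic<%d>: %s" % (model["train_size"], t), t)
            for t in monolingual_batch]


def iterative_batch_back_translation(parallel_data, monolingual_data,
                                     batch_size, iterations):
    data = list(parallel_data)
    for i in range(iterations):
        # 1. Train a backward (target-to-source) model on the current,
        #    progressively growing parallel data.
        backward = train_model(data)
        # 2. Back-translate the next batch of monolingual target text
        #    with the (presumably improved) backward model.
        batch = monolingual_data[i * batch_size:(i + 1) * batch_size]
        synthetic = back_translate(backward, batch)
        # 3. Add the new back-translations to the parallel data.
        data.extend(synthetic)
    # Finally, train the forward model on authentic + synthetic pairs.
    return train_model(data), data
```

For example, starting from 2 authentic pairs and 4 monolingual sentences processed in batches of 2 over 2 iterations, the final forward model is trained on 6 sentence pairs.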
