
Context-aware Neural Machine Translation with Mini-batch Embedding

2021-04-01 · EACL 2021 · Code Available

Makoto Morishita, Jun Suzuki, Tomoharu Iwata, Masaaki Nagata


Abstract

Providing inter-sentence context to Neural Machine Translation (NMT) models is crucial for higher-quality translation. Aiming for a simple way to incorporate inter-sentence information, we propose mini-batch embedding (MBE), which represents the features of the sentences in a mini-batch. We construct each mini-batch from sentences of the same document, so the MBE is expected to capture contextual information across sentences. We incorporate MBE into an NMT model, and our experiments show that the proposed method consistently outperforms strong baselines in translation quality and adjusts writing style and terminology to fit the document's context.
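The abstract's core idea, pooling the sentences of a document-level mini-batch into a shared context vector that each sentence's representation is then augmented with, can be sketched roughly as follows. This is a minimal illustration under assumed details: the pooling operator (mean), the augmentation (concatenation), and the function names are all hypothetical stand-ins, not the paper's actual implementation.

```python
import numpy as np

def mini_batch_embedding(sentence_encodings):
    """Hypothetical pooling step: collapse the per-sentence encodings
    of one document's mini-batch into a single context vector via
    mean pooling (the paper's actual pooling may differ)."""
    return np.mean(sentence_encodings, axis=0)

def augment_with_context(sentence_encodings):
    """Hypothetical augmentation step: attach the shared mini-batch
    embedding to every sentence encoding so the model can condition
    on document-level context."""
    mbe = mini_batch_embedding(sentence_encodings)
    return [np.concatenate([enc, mbe]) for enc in sentence_encodings]

# A mini-batch of 3 sentences drawn from the same document,
# each encoded as a 4-dimensional vector.
batch = np.random.rand(3, 4)
augmented = augment_with_context(batch)  # 3 vectors of dimension 8
```

Because every sentence in the batch comes from the same document, the pooled vector carries cross-sentence signal (e.g. topic or register) that a purely sentence-level encoder would miss.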
