
Almost Free Semantic Draft for Neural Machine Translation

2021-06-01 · NAACL 2021

Xi Ai, Bin Fang


Abstract

Translation quality can be improved by global information about the target sentence, because the decoder can then exploit both past and future context. However, producing and attending to such global information adds extra cost to the model. In this work, to inject global information while keeping the cost low, we present an efficient method that samples a semantic draft from a semantic space and conditions decoding on it at almost no extra cost. Unlike other successful adaptations, we do not have to run an EM-like process that repeatedly samples a candidate semantic draft from the semantic space. Empirical experiments show that the presented method achieves competitive performance on common language pairs with a clear advantage in inference efficiency. We will open-source all our code on GitHub.
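The abstract gives no implementation details, but the core idea can be sketched: pool the encoder states once into a single draft vector in a semantic space, then condition every decoder step on that fixed vector, instead of repeatedly re-sampling drafts in an EM-like loop. The function names (`sample_semantic_draft`, `decode_step`), the mean-pooling, and all dimensions below are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_semantic_draft(encoder_states, W_proj):
    """Pool encoder states and project them into a semantic space,
    yielding a single draft vector (computed once, no EM-style loop).
    Mean-pooling here is an assumption for illustration."""
    pooled = encoder_states.mean(axis=0)          # (d_model,)
    return np.tanh(pooled @ W_proj)               # (d_sem,)

def decode_step(prev_token_emb, draft, W_dec):
    """One decoder step conditioned on the fixed semantic draft,
    here by simple concatenation with the previous token embedding."""
    x = np.concatenate([prev_token_emb, draft])   # (d_model + d_sem,)
    return x @ W_dec                              # logits over vocab

d_model, d_sem, vocab = 8, 4, 16
enc = rng.normal(size=(5, d_model))               # 5 source positions
W_proj = rng.normal(size=(d_model, d_sem))
W_dec = rng.normal(size=(d_model + d_sem, vocab))

draft = sample_semantic_draft(enc, W_proj)        # sampled once per sentence
logits = decode_step(rng.normal(size=d_model), draft, W_dec)
print(logits.shape)
```

Because the draft is produced once per sentence and merely concatenated at each step, the marginal decoding cost is a single extra vector per step, which is consistent with the paper's "almost free" claim.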
