
Incorporating a Local Translation Mechanism into Non-autoregressive Translation

2020-11-12 · EMNLP 2020 · Code Available

Xiang Kong, Zhisong Zhang, Eduard Hovy


Abstract

In this work, we introduce a novel local autoregressive translation (LAT) mechanism into non-autoregressive translation (NAT) models so as to capture local dependencies among target outputs. Specifically, for each target decoding position, instead of only one token, we predict a short sequence of tokens in an autoregressive way. We further design an efficient merging algorithm to align and merge the output pieces into one final output sequence. We integrate LAT into the conditional masked language model (CMLM; Ghazvininejad et al., 2019) and similarly adopt iterative decoding. Empirical results on five translation tasks show that, compared with CMLM, our method achieves comparable or better performance with fewer decoding iterations, bringing a 2.5x speedup. Further analysis indicates that our method reduces repeated translations and performs better on longer sentences.
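To illustrate the idea of merging locally autoregressive output pieces, here is a minimal sketch. It assumes each decoding position emits a short token piece and that consecutive pieces overlap by a fixed number of tokens; the paper's actual alignment and merging algorithm is more involved, so treat this only as an intuition aid, not the authors' method.

```python
def merge_pieces(pieces, overlap=1):
    """Merge short, locally autoregressive token pieces into one sequence.

    Illustrative sketch only: assumes consecutive pieces may repeat up to
    `overlap` tokens at their boundary. Collapsing that repeated boundary
    is also how such a merge can reduce repeated translations.
    """
    if not pieces:
        return []
    merged = list(pieces[0])
    for piece in pieces[1:]:
        # Find the largest k <= overlap where the previous tail matches
        # this piece's prefix, then append only the non-duplicated part.
        k = overlap
        while k > 0 and merged[-k:] != list(piece[:k]):
            k -= 1
        merged.extend(piece[k:])
    return merged
```

For example, the pieces `["a","b"]`, `["b","c"]`, `["c","d"]` merge into `["a","b","c","d"]`, with the duplicated boundary tokens collapsed.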

Benchmark Results

| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| WMT2014 English-German | CMLM+LAT (4 iterations) | BLEU score | 27.35 | — | Unverified |
| WMT2014 English-German | CMLM+LAT (1 iteration) | BLEU score | 25.20 | — | Unverified |
| WMT2014 German-English | CMLM+LAT (4 iterations) | BLEU score | 32.04 | — | Unverified |
| WMT2014 German-English | CMLM+LAT (1 iteration) | BLEU score | 29.91 | — | Unverified |
| WMT2016 English-Romanian | CMLM+LAT (1 iteration) | BLEU score | 30.74 | — | Unverified |
| WMT2016 English-Romanian | CMLM+LAT (4 iterations) | BLEU score | 32.87 | — | Unverified |
| WMT2016 Romanian-English | CMLM+LAT (4 iterations) | BLEU score | 33.26 | — | Unverified |
| WMT2016 Romanian-English | CMLM+LAT (1 iteration) | BLEU score | 31.24 | — | Unverified |

Reproductions