
Combination of Neural Machine Translation Systems at WMT20

2020-11-01 · WMT (EMNLP) 2020

Benjamin Marie, Raphael Rubino, Atsushi Fujita


Abstract

This paper presents neural machine translation systems, and their combination, built for the WMT20 English-Polish and Japanese→English translation tasks. We show that using a Transformer Big architecture, additional training data synthesized from monolingual data, and combining many NMT systems through n-best list reranking improve translation quality. However, while we observed such improvements on the validation data, we did not observe similar improvements on the test data. Our analysis reveals that the presence of translationese texts in the validation data led us to make decisions in building our NMT systems that were not optimal for obtaining the best results on the test data.
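The system combination described in the abstract works by pooling candidate translations into an n-best list and reranking them with a weighted combination of system scores. A minimal sketch of this idea follows; the function name, weights, and scores are hypothetical and do not reproduce the authors' actual implementation:

```python
# Illustrative sketch of n-best list reranking (not the paper's implementation).
# Each NMT system assigns a score (e.g. a log-probability) to every candidate
# translation; candidates are reranked by a weighted sum of those scores.

def rerank_nbest(candidates, weights):
    """candidates: list of (translation, [score_per_system]) pairs.
    Returns the candidates sorted best-first by the combined score."""
    def combined(entry):
        _, scores = entry
        return sum(w * s for w, s in zip(weights, scores))
    return sorted(candidates, key=combined, reverse=True)

# Hypothetical n-best list scored by two systems.
nbest = [
    ("translation A", [-1.2, -0.8]),
    ("translation B", [-0.5, -0.9]),
    ("translation C", [-0.7, -1.5]),
]
best, _ = rerank_nbest(nbest, weights=[0.6, 0.4])[0]
```

In practice the combination weights would be tuned on validation data, which is precisely where the abstract notes that translationese in the validation set can mislead such tuning.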
