
Efficient and High-Quality Neural Machine Translation with OpenNMT

2020-07-01 · WS 2020

Guillaume Klein, Dakun Zhang, Clément Chouteau, Josep Crego, Jean Senellart


Abstract

This paper describes the OpenNMT submissions to the WNGT 2020 efficiency shared task. We explore the training and acceleration of Transformer models of various sizes, trained in a teacher-student setup. We also present a custom, optimized C++ inference engine that enables fast CPU and GPU decoding with few dependencies. By combining additional optimizations and parallelization techniques, we create small, efficient, and high-quality neural machine translation models.
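The abstract's "teacher-student setup" refers to knowledge distillation, where a small student model is trained to match a larger teacher. The sketch below illustrates the general word-level variant of that idea, not the paper's exact recipe: the student's loss mixes cross-entropy against the gold token with cross-entropy against the teacher's temperature-softened output distribution. All function names, the temperature, and the mixing weight `alpha` are illustrative assumptions.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, gold_index,
                      temperature=2.0, alpha=0.5):
    """Word-level knowledge distillation loss for one target position.

    Mixes cross-entropy against the teacher's softened distribution
    (soft targets) with cross-entropy against the gold token (hard
    target). Hyperparameters here are illustrative, not from the paper.
    """
    student_soft = softmax(student_logits, temperature)
    teacher_soft = softmax(teacher_logits, temperature)
    # Cross-entropy of the student vs. the teacher's soft targets.
    soft_loss = -sum(t * math.log(s)
                     for t, s in zip(teacher_soft, student_soft))
    # Standard cross-entropy against the gold token.
    hard_loss = -math.log(softmax(student_logits)[gold_index])
    return alpha * soft_loss + (1 - alpha) * hard_loss
```

In practice, distillation for NMT is often done at the sequence level (the student trains directly on the teacher's beam-search translations), which avoids storing full teacher distributions; the word-level form above is simply the easiest to show self-contained.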
