
Neural Machine Translation with the Transformer and Multi-Source Romance Languages for the Biomedical WMT 2018 task

2018-10-01 · WS 2018

Brian Tubay, Marta R. Costa-jussà


Abstract

The Transformer architecture has become the state of the art in Machine Translation. This model, which relies on attention-based mechanisms, has outperformed previous neural machine translation architectures in several tasks. In this system description paper, we report details of training neural machine translation with multi-source Romance languages using the Transformer model, within the evaluation framework of the WMT 2018 biomedical task. Using multi-source languages from the same family yields improvements of over 6 BLEU points.
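The attention-based mechanism at the core of the Transformer is scaled dot-product attention. As a rough illustration (a minimal NumPy sketch of the general mechanism, not the authors' implementation; all names and shapes here are illustrative assumptions):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    # numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# toy example: 3 query positions attending over 4 key/value positions
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out, w = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 8): one weighted combination of values per query
```

In the full model this operation runs in parallel over multiple heads, with learned projections of the queries, keys, and values.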
