
Goku’s Participation in WAT 2020

2020-12-01 · AACL (WAT) 2020

Dongzhe Wang, Ohnmar Htun


Abstract

This paper introduces our neural machine translation systems' participation in WAT 2020 (team ID: goku20). We participated in (i) the Patent, (ii) the Business Scene Dialogue (BSD) document-level translation, and (iii) the Mixed-domain tasks. Despite their simplicity, standard Transformer models have proven very effective in many machine translation systems. Recently, several advanced pre-trained generative models built on the encoder-decoder framework have been proposed. The main focus of this work is to explore how robustly Transformer models perform in translation from the sentence level to the document level, and from resource-rich to low-resource languages. Additionally, we investigated the improvement that fine-tuning on top of pre-trained Transformer-based models can achieve on the various tasks.
