
Sarah's Participation in WAT 2019

2019-11-01 · WS 2019

Raymond Hendy Susanto, Ohnmar Htun, Liling Tan


Abstract

This paper describes our MT systems' participation in WAT 2019. We participated in the (i) Patent, (ii) Timely Disclosure, (iii) Newswire and (iv) Mixed-domain tasks. Our main focus is to explore how similar Transformer models perform on various tasks. We observed that for tasks with smaller datasets, our best model setups are shallower models with fewer attention heads. We investigated practical issues in NMT that often appear in production settings, such as coping with multilinguality and simplifying the pre- and post-processing pipeline in deployment.
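The observation about shallower models can be made concrete with a rough parameter count. The sketch below is an illustrative back-of-the-envelope estimate (not the paper's actual configurations) for a Transformer encoder: halving the number of layers halves the layer parameters, while changing the number of attention heads alone does not change the parameter count, since the per-head dimension is `d_model / n_heads`.

```python
def transformer_encoder_params(n_layers, d_model, d_ff):
    """Approximate parameter count of a Transformer encoder stack.

    Counts only the repeated layer blocks (no embeddings). Head count is
    deliberately absent: Q/K/V/O projections are d_model x d_model in
    total regardless of how many heads they are split into.
    """
    # Self-attention: Q, K, V, O projections (weights + biases)
    attn = 4 * d_model * d_model + 4 * d_model
    # Position-wise feed-forward: d_model -> d_ff -> d_model (weights + biases)
    ffn = 2 * d_model * d_ff + d_ff + d_model
    # Two layer norms (scale + bias each)
    ln = 4 * d_model
    return n_layers * (attn + ffn + ln)

# Hypothetical "base" vs. shallower setup for a low-resource task
base = transformer_encoder_params(n_layers=6, d_model=512, d_ff=2048)
shallow = transformer_encoder_params(n_layers=3, d_model=512, d_ff=2048)
print(base, shallow)  # the shallow stack has exactly half the layer parameters
```

Fewer heads at a fixed `d_model` instead trade the number of attention patterns against the dimensionality of each head, which can help when there is too little data to learn many distinct heads.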
