
Transfer Learning for Related Languages: Submissions to the WMT20 Similar Language Translation Task

2020-11-01 · WMT (EMNLP) 2020

Lovish Madaan, Soumya Sharma, Parag Singla



Abstract

In this paper, we describe IIT Delhi's submissions to the WMT 2020 Similar Language Translation task for four language directions: Hindi <-> Marathi and Spanish <-> Portuguese. We try out three different model settings for the translation task and select our primary and contrastive submissions on the basis of the performance of these models. For our best submissions, we fine-tune the mBART model on the parallel data provided for the task. mBART is pre-trained using self-supervised objectives on large amounts of monolingual data in many languages. Overall, our models rank in the top four of all systems for the submitted language pairs, with first rank in Spanish -> Portuguese.
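As a rough illustration of the fine-tuning setup the abstract describes (not the authors' released code): mBART conditions on special language-identifier tokens, so each parallel sentence pair is wrapped with source- and target-language tags before training. The helper name and the exact tag placement below are assumptions for illustration only.

```python
# Hedged sketch: preparing one parallel sentence pair for mBART-style
# fine-tuning. The language codes (e.g. "es_XX", "pt_XX") follow mBART's
# naming convention, but the bracketed-tag formatting and the helper name
# here are illustrative, not the paper's actual pipeline.

def make_example(src_text, tgt_text, src_lang, tgt_lang):
    """Wrap a parallel sentence pair with language tags (illustrative)."""
    return {
        "input": f"{src_text} </s> [{src_lang}]",
        "label": f"[{tgt_lang}] {tgt_text} </s>",
    }

# One Spanish -> Portuguese pair, one of the task's language directions.
pairs = [
    ("¿Cómo estás?", "Como estás?", "es_XX", "pt_XX"),
]

examples = [make_example(s, t, sl, tl) for s, t, sl, tl in pairs]
for ex in examples:
    print(ex["input"], "->", ex["label"])
```

In practice one would feed such tagged pairs to a pre-trained mBART checkpoint and continue training with the usual sequence-to-sequence cross-entropy loss; the tags tell the model which translation direction is intended.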
