
On Romanization for Model Transfer Between Scripts in Neural Machine Translation

2020-09-30 · Findings of the Association for Computational Linguistics

Chantal Amrhein, Rico Sennrich


Abstract

Transfer learning is a popular strategy to improve the quality of low-resource machine translation. For an optimal transfer of the embedding layer, the child and parent model should share a substantial part of the vocabulary. This is not the case when transferring to languages with a different script. We explore the benefit of romanization in this scenario. Our results show that romanization entails information loss and is thus not always superior to simpler vocabulary transfer methods, but can improve the transfer between related languages with different scripts. We compare two romanization tools and find that they exhibit different degrees of information loss, which affects translation quality. Finally, we extend romanization to the target side, showing that this can be a successful strategy when coupled with a simple deromanization model.
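The information loss mentioned above stems from romanization being a many-to-one mapping: distinct characters in the source script can collapse to the same Latin character. As a minimal illustration (not the tooling evaluated in the paper, which compares two dedicated romanization tools), the sketch below uses the third-party Python `unidecode` library to show such a collision and why deromanization cannot be a simple table lookup:

```python
# Minimal sketch of romanization-induced information loss.
# Uses the `unidecode` library (pip install unidecode) purely for
# illustration; it is not one of the romanization tools compared
# in the paper.
from unidecode import unidecode

# Two distinct Russian words: "мэр" (mayor) and "мер" ("measures",
# genitive plural). Cyrillic <э> and <е> both romanize to Latin <e>,
# so the two words collapse to the same string.
words = ["мэр", "мер"]
romanized = [unidecode(w) for w in words]
print(romanized)                      # ['mer', 'mer']
assert romanized[0] == romanized[1]   # many-to-one: not invertible

# Because the mapping is many-to-one, recovering the original script
# (deromanization) is ambiguous and needs a learned model rather than
# a reverse lookup table, which is why target-side romanization is
# paired with a deromanization model in the paper.
```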
