Multilingual Neural Machine Translation

2020-12-01 · COLING 2020

Raj Dabre, Chenhui Chu, Anoop Kunchukuttan

Abstract

The advent of neural machine translation (NMT) has opened up exciting research in building multilingual translation systems, i.e., translation models that can handle more than one language pair. Many advances have enabled (1) improved translation for low-resource languages via transfer learning from high-resource languages; and (2) compact translation models spanning multiple languages. In this tutorial, we will cover the latest advances in NMT approaches that leverage multilingualism, especially to enhance low-resource translation. In particular, we will focus on the following topics: modeling parameter sharing for multi-way models, massively multilingual models, training protocols, language divergence, transfer learning, zero-shot/zero-resource learning, pivoting, multilingual pre-training, and multi-source translation.
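A common way to realize the multi-way models the abstract mentions is to train a single shared encoder-decoder on a mixed corpus in which each source sentence carries a tag naming the desired target language. The sketch below illustrates this data-side trick; the `<2xx>` tag format and the helper names are illustrative assumptions, not taken from the tutorial itself.

```python
# Hypothetical sketch of the target-language-tag trick for multi-way NMT:
# one shared model serves many language pairs because each source sentence
# is prefixed with a tag such as <2fr> indicating the target language.

def tag_for_multiway(src_sentence: str, tgt_lang: str) -> str:
    """Prepend a target-language tag so one model can serve many pairs."""
    return f"<2{tgt_lang}> {src_sentence}"

def build_corpus(bitext_by_pair):
    """Flatten per-pair bitext into one mixed multilingual training corpus.

    bitext_by_pair: {(src_lang, tgt_lang): [(src_sentence, tgt_sentence), ...]}
    Returns a list of (tagged_source, target) training examples.
    """
    corpus = []
    for (src_lang, tgt_lang), pairs in bitext_by_pair.items():
        for src, tgt in pairs:
            corpus.append((tag_for_multiway(src, tgt_lang), tgt))
    return corpus

if __name__ == "__main__":
    data = {
        ("en", "fr"): [("hello", "bonjour")],
        ("en", "de"): [("hello", "hallo")],
    }
    for tagged_src, tgt in build_corpus(data):
        print(tagged_src, "->", tgt)
```

Because the tag is just another token in the source vocabulary, the same mechanism supports zero-shot directions at inference time: swapping the tag requests a target language the pair was never directly trained on.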
