SOTAVerified

Unsupervised Multilingual Alignment using Wasserstein Barycenter

2020-01-28

Xin Lian, Kshitij Jain, Jakub Truszkowski, Pascal Poupart, Yao-Liang Yu


Abstract

We study unsupervised multilingual alignment, the problem of finding word-to-word translations between multiple languages without using any parallel data. One popular strategy is to reduce multilingual alignment to the much simpler bilingual setting, by picking one of the input languages as the pivot language that we transit through. However, it is well-known that transiting through a poorly chosen pivot language (such as English) may severely degrade the translation quality, since the assumed transitive relations among all pairs of languages may not be enforced in the training process. Instead of going through a rather arbitrarily chosen pivot language, we propose to use the Wasserstein barycenter as a more informative "mean" language: it encapsulates information from all languages and minimizes all pairwise transportation costs. We evaluate our method on standard benchmarks and demonstrate state-of-the-art performance.
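The core subroutine behind this kind of alignment is an optimal-transport plan between two sets of word embeddings: the plan's largest entries give word-to-word matches, and the barycenter is built from such plans across all language pairs. The sketch below is not the authors' implementation; it is a minimal numpy illustration of the entropic-regularized (Sinkhorn) transport plan on made-up toy vectors, where the "target vocabulary" is a permuted copy of the source.

```python
import numpy as np

def sinkhorn_plan(a, b, C, reg=0.1, n_iter=500):
    """Entropic-regularized OT plan between histograms a, b with cost matrix C.

    Plain Sinkhorn iterations: alternately rescale rows and columns of
    K = exp(-C/reg) until the plan's marginals match a and b.
    """
    K = np.exp(-C / reg)
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]

# Toy "embeddings" for two hypothetical 4-word vocabularies (made-up data):
# the target vectors are the source vectors under the permutation [2, 0, 3, 1].
X = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
Y = X[[2, 0, 3, 1]]

# Squared Euclidean transport cost and uniform word frequencies.
C = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
a = np.full(4, 0.25)
b = np.full(4, 0.25)

P = sinkhorn_plan(a, b, C)

# Word-to-word translation: match each source word to the target word
# receiving the most transport mass.
matches = P.argmax(axis=1)  # recovers the inverse permutation [1, 3, 0, 2]
```

In the multilingual setting of the paper, plans like `P` are computed jointly with a barycenter distribution that minimizes the sum of such transport costs over all languages, so every pair of languages can be matched through the barycenter instead of through one arbitrary pivot.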

Benchmark Results

Dataset     Model                 Metric  Claimed  Verified  Status
en-es       Barycenter Alignment  P@1     84.26    -         Unverified
en-fr       Barycenter Alignment  P@1     82.94    -         Unverified
en-it       Barycenter Alignment  P@1     81.45    -         Unverified
es-en       Barycenter Alignment  P@1     83.5     -         Unverified
fr-en       Barycenter Alignment  P@1     83.23    -         Unverified
MUSE en-de  Barycenter Alignment  P@1     74.08    -         Unverified
MUSE en-pt  Barycenter Alignment  P@1     84.65    -         Unverified
