
Filtering Noisy Parallel Corpus using Transformers with Proxy Task Learning

2020-11-01 · WMT (EMNLP) 2020 · Code Available

Haluk Açarçiçek, Talha Çolakoğlu, Pınar Ece Aktan Hatipoğlu, Chong Hsuan Huang, Wei Peng


Abstract

This paper illustrates Huawei’s submission to the WMT20 low-resource parallel corpus filtering shared task. Our approach focuses on developing a proxy task learner on top of a transformer-based multilingual pre-trained language model to boost the filtering capability for noisy parallel corpora. Such a supervised task also lets us iterate much more quickly than using an existing neural machine translation system to perform the same task. After performing empirical analyses of the fine-tuning task, we benchmark our approach by comparing the results with past years’ state-of-the-art records. This paper wraps up with a discussion of limitations and future work. The scripts for this study will be made publicly available.
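
The core idea in the abstract is to cast filtering as a supervised proxy task: a classifier on top of a multilingual pre-trained transformer scores each sentence pair, and low-scoring pairs are dropped. The sketch below illustrates that scoring interface only; the specific model (xlm-roberta-base), the Hugging Face Transformers API, and the example threshold are assumptions for illustration, not details taken from the paper.

# Minimal sketch of a proxy-task pair scorer, assuming xlm-roberta-base
# and the Hugging Face Transformers library (not the paper's exact setup).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "xlm-roberta-base", num_labels=2  # label 1 = clean pair, label 0 = noisy pair
)
model.eval()

def pair_quality_score(src: str, tgt: str) -> float:
    """Score a source/target pair; higher means more likely parallel.

    In practice the head would first be fine-tuned on labeled clean/noisy
    pairs (the supervised proxy task); the untrained head here only
    demonstrates the interface.
    """
    inputs = tokenizer(src, tgt, return_tensors="pt",
                       truncation=True, max_length=256)
    with torch.no_grad():
        logits = model(**inputs).logits
    # Probability of the "clean" class serves as the filtering score.
    return torch.softmax(logits, dim=-1)[0, 1].item()

# Keep only pairs whose score clears a threshold tuned on held-out data.
corpus = [("Das ist ein Test.", "This is a test."),
          ("Völlig unzusammenhängend.", "Buy cheap widgets now!!!")]
filtered = [(s, t) for s, t in corpus if pair_quality_score(s, t) > 0.5]
print(filtered)

Because the proxy task is an ordinary supervised classifier, each training iteration is far cheaper than retraining or rescoring with a full neural machine translation system, which is the speed advantage the abstract points to.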
