
Cross-lingual Transfer Learning with Data Selection for Large-Scale Spoken Language Understanding

2019-11-01 · IJCNLP 2019

Quynh Do, Judith Gaspers


Abstract

A typical cross-lingual transfer learning approach for boosting model performance in a target language is to pre-train the model on all available supervised data from a source language. However, in large-scale systems this leads to long training times and high computational requirements. In addition, characteristic differences between the source and target languages raise the natural question of whether selecting a subset of the source data can improve knowledge transfer. In this paper, we address this question and propose a simple but effective language-model-based source-data selection method for cross-lingual transfer learning in large-scale spoken language understanding. The experimental results show that with data selection i) the amount of source data, and hence the training time, is reduced significantly, and ii) model performance is improved.
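The core idea of language-model-based data selection can be illustrated with a minimal, hypothetical sketch: train a language model on target-language text, score each source-language sentence by its perplexity under that model, and keep only the best-scoring (most target-like) sentences for pre-training. The sketch below uses a simple unigram LM with add-one smoothing purely for illustration; the paper's actual model, tokenization, and selection criterion may differ, and all function names here are invented.

```python
import math
from collections import Counter


def train_unigram_lm(sentences):
    """Train a unigram LM with add-one smoothing on target-language text.

    Returns a function mapping a token to its log-probability.
    (Illustrative only; the paper's LM may be more sophisticated.)
    """
    counts = Counter(tok for s in sentences for tok in s.split())
    total = sum(counts.values())
    vocab = len(counts) + 1  # +1 reserves mass for unseen tokens

    def log_prob(token):
        return math.log((counts[token] + 1) / (total + vocab))

    return log_prob


def perplexity(log_prob, sentence):
    """Per-token perplexity of a sentence under the LM."""
    toks = sentence.split()
    return math.exp(-sum(log_prob(t) for t in toks) / max(len(toks), 1))


def select_source_data(source_sentences, target_sentences, keep_ratio=0.5):
    """Keep the source sentences the target-language LM finds most probable."""
    lm = train_unigram_lm(target_sentences)
    ranked = sorted(source_sentences, key=lambda s: perplexity(lm, s))
    k = max(1, int(len(ranked) * keep_ratio))
    return ranked[:k]
```

For example, given target-language utterances from a music domain, an in-domain source sentence would rank ahead of an out-of-domain one and be selected for pre-training.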
