Improving Code-switching Language Modeling with Artificially Generated Texts using Cycle-consistent Adversarial Networks
2021-12-12
Chia-Yu Li, Ngoc Thang Vu
Abstract
This paper presents our latest effort to improve Code-switching language models, which suffer from data scarcity. We investigate methods to augment Code-switching training data by generating text artificially. Concretely, we propose a framework based on cycle-consistent adversarial networks that transfers monolingual text into Code-switching text, treating Code-switching as a speaking style. Our experimental results on the SEAME corpus show that utilising the artificially generated Code-switching text consistently improves both the language model and the automatic speech recognition performance.
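To make the cycle-consistency idea in the abstract concrete, below is a minimal sketch of one transfer direction (monolingual → Code-switching → monolingual) operating on sentence embeddings. This is not the authors' implementation: the toy linear generators, the embedding-level formulation, the module names, and the loss weight are all illustrative assumptions.

```python
# Minimal sketch (not the paper's code) of CycleGAN-style text transfer:
# a generator maps monolingual embeddings to a Code-switching "style",
# an inverse generator maps back, and a cycle loss pulls the round trip
# back to the input. All shapes and names here are assumptions.
import torch
import torch.nn as nn

EMB = 128  # assumed sentence-embedding size

class Generator(nn.Module):
    """Toy style-transfer generator over sentence embeddings."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(EMB, EMB), nn.Tanh(), nn.Linear(EMB, EMB))
    def forward(self, x):
        return self.net(x)

class Discriminator(nn.Module):
    """Scores whether an embedding looks like real Code-switching text."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(EMB, 64), nn.ReLU(), nn.Linear(64, 1))
    def forward(self, x):
        return self.net(x)

G_m2cs, G_cs2m = Generator(), Generator()  # mono -> CS and CS -> mono
D_cs = Discriminator()                     # real-vs-fake CS discriminator
adv, l1 = nn.BCEWithLogitsLoss(), nn.L1Loss()
lam = 10.0  # cycle-loss weight; 10 is the common CycleGAN default

mono = torch.randn(32, EMB)  # stand-in batch of encoded monolingual sentences

fake_cs = G_m2cs(mono)   # transfer: monolingual -> Code-switching style
recon = G_cs2m(fake_cs)  # cycle back: Code-switching -> monolingual

g_adv = adv(D_cs(fake_cs), torch.ones(32, 1))  # fool the CS discriminator
g_cyc = l1(recon, mono)                        # cycle consistency with the input
g_loss = g_adv + lam * g_cyc
g_loss.backward()
```

A full CycleGAN trains the symmetric direction (CS → mono → CS) with its own discriminator as well; only one half-cycle is shown here for brevity.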