Effects of Layer Freezing on Transferring a Speech Recognition System to Under-resourced Languages

2021-02-08 · KONVENS (WS) 2021 · Code Available

Onno Eberhard, Torsten Zesch

Abstract

In this paper, we investigate the effect of layer freezing on the effectiveness of model transfer in the area of automatic speech recognition. We experiment with Mozilla's DeepSpeech architecture on German and Swiss German speech datasets and compare the results of training from scratch with those of transferring a pre-trained model. We compare different layer freezing schemes and find that freezing even a single layer already significantly improves results.
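
Since the paper centers on freezing the lower layers of a pre-trained network before fine-tuning, a minimal sketch of the general idea may be useful. The following Python/PyTorch code is purely illustrative and is not the authors' actual DeepSpeech (TensorFlow) setup; the model shape, the helper freeze_first_layers, and all dimensions are hypothetical stand-ins.

import torch
import torch.nn as nn

# Hypothetical stand-in for a DeepSpeech-like stack of dense layers;
# all dimensions are placeholders, not the paper's configuration.
model = nn.Sequential(
    nn.Linear(494, 2048), nn.ReLU(),   # layer 1
    nn.Linear(2048, 2048), nn.ReLU(),  # layer 2
    nn.Linear(2048, 2048), nn.ReLU(),  # layer 3
    nn.Linear(2048, 29),               # output layer (placeholder size)
)

def freeze_first_layers(net: nn.Sequential, n_children: int) -> None:
    # Disable gradient updates for the first n_children submodules,
    # keeping their pre-trained weights fixed during fine-tuning.
    for child in list(net.children())[:n_children]:
        for param in child.parameters():
            param.requires_grad = False

# Freeze the first dense block (Linear + ReLU); per the abstract,
# freezing even one layer already improves transfer results.
freeze_first_layers(model, 2)

# Hand only the still-trainable parameters to the optimizer.
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4
)

The design point is that frozen parameters are excluded from the optimizer, so the pre-trained low-level features stay fixed while the remaining layers adapt to the under-resourced target language.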
