Transfer learning for music classification and regression tasks
Keunwoo Choi, György Fazekas, Mark Sandler, Kyunghyun Cho
Code
- github.com/keunwoochoi/transfer_learning_music (official implementation)
- github.com/eatsleepraverepeat/emusic_net (TensorFlow)
- github.com/DaCreasy/DLFinal
Abstract
In this paper, we present a transfer learning approach for music classification and regression tasks. We propose to use a pre-trained convnet feature: a concatenated feature vector built from the activations of feature maps of multiple layers in a trained convolutional network. We show how this convnet feature can serve as a general-purpose music representation. In the experiments, a convnet is trained for music tagging and then transferred to other music-related classification and regression tasks. The convnet feature outperforms the baseline MFCC feature in all considered tasks, as well as several previous approaches that aggregate MFCCs and low- and high-level music features.
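The feature-construction step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the layer shapes are hypothetical, and random arrays stand in for real activations from a trained tagging convnet. Each layer's feature maps are reduced by global average pooling and the per-layer vectors are concatenated into one convnet feature.

```python
import numpy as np

def global_avg_pool(feature_map):
    """Reduce a (channels, time, freq) feature map to a (channels,) vector."""
    return feature_map.mean(axis=(1, 2))

# Hypothetical channel counts for five conv layers of a trained tagging convnet;
# random arrays stand in for the activations produced on one audio clip.
rng = np.random.default_rng(0)
layer_channels = [32, 32, 64, 64, 128]
activations = [rng.standard_normal((c, 16, 16)) for c in layer_channels]

# Concatenate the pooled per-layer vectors into a single convnet feature.
convnet_feature = np.concatenate([global_avg_pool(a) for a in activations])
print(convnet_feature.shape)  # (320,) — the sum of the layer channel counts
```

The resulting fixed-length vector can then be fed to a simple classifier or regressor for the target task, which is the transfer setting the experiments evaluate.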