
A Large-Scale Study of Language Models for Chord Prediction

2018-04-05

Filip Korzeniowski, David R. W. Sears, Gerhard Widmer

Abstract

We conduct a large-scale study of language models for chord prediction. Specifically, we compare N-gram models to various flavours of recurrent neural networks on a comprehensive dataset comprising all publicly available datasets of annotated chords known to us. This large amount of data allows us to systematically explore hyper-parameter settings for the recurrent neural networks, a crucial step in achieving good results with this model class. Our results show not only a quantitative difference between the models, but also a qualitative one: in contrast to static N-gram models, certain RNN configurations adapt to the songs at test time. This finding constitutes a further step towards the development of chord recognition systems that are more aware of local musical context than what was previously possible.
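To make the N-gram baseline concrete, the following is a minimal sketch (not the authors' code) of a bigram chord language model with add-one smoothing, trained on a hypothetical toy set of chord annotations rather than the corpus used in the paper:

```python
from collections import Counter, defaultdict

def train_bigram(sequences, smoothing=1.0):
    """Train an add-one-smoothed bigram model over chord symbols.

    Returns a conditional probability function P(cur | prev) and the vocabulary.
    """
    counts = defaultdict(Counter)
    vocab = set()
    for seq in sequences:
        vocab.update(seq)
        for prev, cur in zip(seq, seq[1:]):
            counts[prev][cur] += 1
    vocab = sorted(vocab)

    def prob(prev, cur):
        c = counts[prev]
        # Laplace smoothing so unseen transitions get non-zero probability
        return (c[cur] + smoothing) / (sum(c.values()) + smoothing * len(vocab))

    return prob, vocab

# Toy chord sequences (illustrative only, not data from the paper)
songs = [["C", "F", "G", "C"], ["C", "G", "Am", "F", "C"]]
prob, vocab = train_bigram(songs)
print(round(prob("G", "C"), 3))  # → 0.333
```

Unlike such a static model, whose transition probabilities are fixed after training, the RNN configurations highlighted in the paper adjust their predictions to the song being processed at test time.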
