CMA-ES for Hyperparameter Optimization of Deep Neural Networks

2016-04-25

Ilya Loshchilov, Frank Hutter

Abstract

Hyperparameters of deep neural networks are often optimized by grid search, random search or Bayesian optimization. As an alternative, we propose to use the Covariance Matrix Adaptation Evolution Strategy (CMA-ES), which is known for its state-of-the-art performance in derivative-free optimization. CMA-ES has some useful invariance properties and lends itself naturally to parallel evaluation of candidate solutions. We provide a toy example comparing CMA-ES and state-of-the-art Bayesian optimization algorithms for tuning the hyperparameters of a convolutional neural network for the MNIST dataset on 30 GPUs in parallel.
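The core loop behind the proposal can be sketched as follows. This is a simplified evolution strategy with isotropic sampling, not the full CMA-ES of the paper (which additionally adapts a covariance matrix and step size online); the surrogate objective, its optimum, the population size, and the step-size schedule below are all illustrative assumptions, standing in for the expensive training-and-validation runs a real hyperparameter search would dispatch, e.g. to GPUs in parallel.

```python
import math
import random


def surrogate_loss(log_lr, log_bs):
    """Hypothetical smooth stand-in for the validation error of a CNN
    as a function of log learning rate and log batch size; its minimum
    at (-3, 5) is purely illustrative, not a result from the paper."""
    return (log_lr + 3.0) ** 2 + 0.5 * (log_bs - 5.0) ** 2


def es_optimize(loss, x0, sigma=1.0, lam=8, iters=120, seed=0):
    """Simplified (mu/mu_w, lambda) evolution strategy.

    Each generation samples `lam` candidate hyperparameter settings
    around the current mean -- these evaluations are independent and
    could run in parallel -- then recombines the best `mu` of them
    with log-linear weights, as in standard CMA-ES.
    """
    rng = random.Random(seed)
    mu = lam // 2
    # Log-linear recombination weights favoring better-ranked candidates.
    w = [math.log(mu + 0.5) - math.log(i + 1) for i in range(mu)]
    total = sum(w)
    w = [wi / total for wi in w]

    mean = list(x0)
    for _ in range(iters):
        # Sample lambda candidates from an isotropic Gaussian (full
        # CMA-ES would sample from an adapted covariance matrix).
        pop = [[m + sigma * rng.gauss(0.0, 1.0) for m in mean]
               for _ in range(lam)]
        pop.sort(key=lambda x: loss(*x))
        # Weighted recombination of the mu best candidates.
        mean = [sum(w[i] * pop[i][d] for i in range(mu))
                for d in range(len(mean))]
        # Crude fixed step-size decay; CMA-ES adapts sigma online.
        sigma *= 0.97
    return mean


best = es_optimize(surrogate_loss, x0=[0.0, 0.0])
```

Because the candidate evaluations within a generation are independent, a population of size lambda maps directly onto lambda parallel workers, which is the property the abstract highlights for GPU clusters.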
