Practical Transfer Learning for Bayesian Optimization
Matthias Feurer, Benjamin Letham, Frank Hutter, Eytan Bakshy
Code
- github.com/automl/transfer-hpo-framework (official)
- github.com/mfeurer/rgpe-code-release
Abstract
When hyperparameter optimization of a machine learning algorithm is repeated across multiple datasets, knowledge can be transferred to an optimization run on a new dataset. We develop a new hyperparameter-free ensemble model for Bayesian optimization that generalizes two existing transfer learning extensions of Bayesian optimization, and we establish a worst-case bound relative to vanilla Bayesian optimization. On a large collection of hyperparameter optimization benchmark problems, we demonstrate that our contributions substantially reduce optimization time compared to standard Gaussian process-based Bayesian optimization and improve on the current state of the art in transfer hyperparameter optimization.
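To make the abstract's idea concrete, here is a heavily simplified sketch of a rank-weighted ensemble surrogate for transfer Bayesian optimization. Base models fitted on previous datasets are weighted by how well they rank the observations seen so far on the new dataset, which requires no extra hyperparameters. All function names, and the winner-takes-all weighting, are illustrative assumptions for this sketch; the paper's actual model is built from Gaussian processes and a more refined weighting scheme.

```python
def ranking_loss(model, X, y):
    """Count pairwise order disagreements between a model's predictions
    and the observed target values (lower means better transfer).
    `model` is any callable x -> predicted objective value."""
    preds = [model(x) for x in X]
    loss = 0
    for i in range(len(X)):
        for j in range(len(X)):
            if (preds[i] < preds[j]) != (y[i] < y[j]):
                loss += 1
    return loss

def ensemble_weights(base_models, X, y):
    """Hyperparameter-free weighting: split weight evenly among the base
    models that achieve the lowest ranking loss on the target data.
    (A crude stand-in for the paper's weighting scheme.)"""
    losses = [ranking_loss(m, X, y) for m in base_models]
    best = min(losses)
    wins = [1.0 if l == best else 0.0 for l in losses]
    total = sum(wins)
    return [w / total for w in wins]

def ensemble_predict(base_models, weights, x):
    """Prediction of the ensemble surrogate: a weighted combination of
    the base models' predictions at point x."""
    return sum(w * m(x) for w, m in zip(weights, base_models))
```

In a full transfer-BO loop, the weights would be recomputed after each new target observation and the ensemble prediction would feed an acquisition function; here only the weighting step is shown.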