Stabilizing Bi-Level Hyperparameter Optimization using Moreau-Yosida Regularization
2020-07-27
Sauptik Dhar, Unmesh Kurup, Mohak Shah
Code: github.com/lorraine2/hypernet-hypertraining (official)
Abstract
This research proposes using the Moreau-Yosida envelope to stabilize the convergence behavior of bi-level hyperparameter optimization (HPO) solvers, and introduces a new algorithm called Moreau-Yosida regularized Hyperparameter Optimization (MY-HPO). A theoretical analysis of the correctness of the MY-HPO solution and an initial convergence analysis are also provided. Our empirical results show significant improvement in loss values for a fixed computation budget, compared to state-of-the-art bi-level HPO solvers.
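The abstract does not spell out the construction, but the Moreau-Yosida envelope of a function f is the standard smoothing M_λf(x) = min_y f(y) + ‖y − x‖²/(2λ). Below is a minimal numerical sketch of evaluating this envelope; the function names and the toy absolute-value objective are illustrative assumptions, not the paper's actual MY-HPO procedure, which applies the envelope to the bi-level HPO objective.

```python
import numpy as np
from scipy.optimize import minimize

def moreau_envelope(f, x, lam=0.1):
    """Evaluate the Moreau-Yosida envelope M_lam f(x) = min_y f(y) + ||y - x||^2 / (2*lam).

    Illustrative sketch only: returns the smoothed value and the proximal
    point argmin_y; MY-HPO in the paper applies this smoothing idea to the
    bi-level hyperparameter optimization objective.
    """
    obj = lambda y: f(y) + np.sum((y - x) ** 2) / (2.0 * lam)
    res = minimize(obj, x)   # inner minimization, warm-started at x
    return res.fun, res.x    # envelope value, proximal point prox_{lam f}(x)

# Toy check (hypothetical test case): for f(y) = |y| the envelope is the
# Huber function and the proximal point is soft-thresholding of x by lam.
f = lambda y: np.abs(y).sum()
val, prox = moreau_envelope(f, np.array([1.0]), lam=0.5)
print(val, prox)  # ~0.75 and ~[0.5]
```

The parameter lam controls the trade-off: larger values smooth the objective more aggressively (stabilizing gradient-based solvers), while smaller values keep the envelope closer to the original function.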