Sparse Regression at Scale: Branch-and-Bound rooted in First-Order Optimization
Hussein Hazimeh, Rahul Mazumder, Ali Saab
Code:
- github.com/alisaab/l0bnb (official implementation)
- github.com/rahulmaz/L0BnB
Abstract
We consider the least squares regression problem, penalized with a combination of the ℓ0 and squared ℓ2 penalty functions (a.k.a. ℓ0ℓ2 regularization). Recent work shows that the resulting estimators are of key importance in many high-dimensional statistical settings. However, exact computation of these estimators remains a major challenge. Indeed, modern exact methods, based on mixed integer programming (MIP), face difficulties when the number of features p ~ 10^4. In this work, we present a new exact MIP framework for ℓ0ℓ2-regularized regression that can scale to p ~ 10^7, achieving speedups of at least 5000x compared to state-of-the-art exact methods. Unlike recent work, which relies on modern commercial MIP solvers, we design a specialized nonlinear branch-and-bound (BnB) framework by critically exploiting the problem structure. A key distinguishing component of our framework lies in efficiently solving the node relaxations using a specialized first-order method based on coordinate descent (CD). Our CD-based method effectively leverages information across the BnB nodes by using warm starts, active sets, and gradient screening. In addition, we design a novel method for obtaining dual bounds from primal CD solutions, which certifiably works in high dimensions. Experiments on synthetic and real high-dimensional datasets demonstrate that our framework is not only significantly faster than the state of the art, but can also deliver certifiably optimal solutions to statistically challenging instances that cannot be handled by existing methods. We open source the implementation through our toolkit L0BnB.
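To make the problem setup concrete, the following is a minimal NumPy sketch of the ℓ0ℓ2-penalized least squares objective described in the abstract: a squared-error loss plus λ0 times the number of nonzero coefficients plus λ2 times the squared ℓ2 norm. The function name, the 1/2 loss scaling, and the penalty weights are illustrative assumptions, not the paper's exact formulation or the L0BnB API.

```python
import numpy as np

def l0l2_objective(X, y, beta, lam0, lam2):
    """Evaluate 0.5*||y - X@beta||^2 + lam0*||beta||_0 + lam2*||beta||_2^2.

    Note: objective scaling is an assumption based on the abstract;
    it is not taken from the paper or the L0BnB source.
    """
    residual = y - X @ beta
    l0_term = np.count_nonzero(beta)      # number of nonzero coefficients
    l2_term = np.dot(beta, beta)          # squared l2 norm
    return 0.5 * residual @ residual + lam0 * l0_term + lam2 * l2_term

# Toy comparison: a sparse coefficient vector that fits the data exactly
# versus a dense guess, under the same penalties.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))
beta_true = np.zeros(10)
beta_true[:3] = 1.0                       # sparse ground truth
y = X @ beta_true                         # noiseless response
dense_guess = np.full(10, 0.3)

sparse_obj = l0l2_objective(X, y, beta_true, lam0=1.0, lam2=0.1)
dense_obj = l0l2_objective(X, y, dense_guess, lam0=1.0, lam2=0.1)
assert sparse_obj < dense_obj             # sparsity is rewarded here
```

The ℓ0 term is what makes exact minimization combinatorial: the solver must effectively decide which coefficients are zero, which is the discrete structure the paper's branch-and-bound framework branches on.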