
Fast Sparse Least-Squares Regression with Non-Asymptotic Guarantees

2015-07-18

Tianbao Yang, Lijun Zhang, Qihang Lin, Rong Jin


Abstract

In this paper, we study a fast approximation method for the large-scale, high-dimensional sparse least-squares regression problem that exploits Johnson-Lindenstrauss (JL) transforms, which embed a set of high-dimensional vectors into a low-dimensional space. In particular, we propose to apply a JL transform to the data matrix and the target vector and then solve a sparse least-squares problem on the compressed data with a slightly larger regularization parameter. Theoretically, we establish optimization error bounds for the learned model under two different sparsity-inducing regularizers, i.e., the elastic net and the ℓ1 norm. Compared with previous relevant work, our analysis is non-asymptotic and offers more insight into the bound, the sample complexity, and the regularization. As an illustration, we also provide an error bound for the Dantzig selector under JL transforms.
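The recipe described in the abstract — sketch the data matrix and target vector with a JL transform, then solve the sparse regression on the compressed problem — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the Gaussian sketch, the sketch dimension `m`, the ISTA solver, and the regularization value are all assumptions chosen for the example.

```python
import numpy as np

def jl_sketch(A, b, m, rng):
    # Gaussian JL transform with entries N(0, 1/m), so that
    # squared norms and inner products are preserved in expectation.
    S = rng.normal(0.0, 1.0 / np.sqrt(m), size=(m, A.shape[0]))
    return S @ A, S @ b

def lasso_ista(A, b, lam, n_iter=500):
    # Solve min_x 0.5*||Ax - b||^2 + lam*||x||_1 by proximal
    # gradient descent (ISTA); any lasso solver would do here.
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the smooth part
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = A.T @ (A @ x - b)            # gradient of the quadratic term
        z = x - g / L                    # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return x

rng = np.random.default_rng(0)
n, d, m = 2000, 50, 200                  # m << n: compress 2000 rows to 200
x_true = np.zeros(d)
x_true[:5] = 1.0                         # 5-sparse ground truth
A = rng.normal(size=(n, d))
b = A @ x_true + 0.01 * rng.normal(size=n)

# Sketch the data, then solve the sparse problem on the compressed pair
# with a (slightly larger) regularization parameter, per the abstract.
A_s, b_s = jl_sketch(A, b, m, rng)
x_hat = lasso_ista(A_s, b_s, lam=2.0)
```

Only the compressed m-by-d problem is ever handed to the solver, which is the source of the speedup when n is much larger than m; the theory in the paper quantifies how much the sketch and the inflated regularizer cost in optimization error.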
