
Faster Ridge Regression via the Subsampled Randomized Hadamard Transform

2013-12-01 · NeurIPS 2013

Yichao Lu, Paramveer Dhillon, Dean P. Foster, Lyle Ungar


Abstract

We propose a fast algorithm for ridge regression when the number of features is much larger than the number of observations (p ≫ n). The standard way to solve ridge regression in this setting works in the dual space and gives a running time of O(n^2 p). Our algorithm (SRHT-DRR) runs in time O(np log(n)) and works by preconditioning the design matrix with a Randomized Walsh-Hadamard Transform followed by subsampling of features. We provide risk bounds for our SRHT-DRR algorithm in the fixed design setting and show experimental results on synthetic and real datasets.
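The pipeline described in the abstract — a random sign flip, a fast Walsh-Hadamard transform over the features, uniform subsampling of the transformed columns, then dual-space ridge regression on the smaller matrix — can be sketched as follows. This is a minimal illustration under assumed details (the rescaling factor, a power-of-two feature count, and the helper names `fwht` and `srht_ridge` are choices made here, not the authors' released code):

```python
import numpy as np

def fwht(a):
    """In-place fast Walsh-Hadamard transform along the last axis.

    The last-axis length must be a power of 2 (pad with zero columns otherwise).
    Runs in O(p log p) per row, which is what gives SRHT its speedup.
    """
    n = a.shape[-1]
    h = 1
    while h < n:
        for i in range(0, n, 2 * h):
            x = a[..., i:i + h].copy()
            y = a[..., i + h:i + 2 * h].copy()
            a[..., i:i + h] = x + y
            a[..., i + h:i + 2 * h] = x - y
        h *= 2
    return a

def srht_ridge(X, y, lam, p_sub, seed=None):
    """Sketch of SRHT-preconditioned dual ridge regression (SRHT-DRR).

    X: (n, p) design matrix with p a power of 2; y: (n,) targets;
    lam: ridge penalty; p_sub: number of feature columns kept after subsampling.
    Returns the weight vector in the sketched feature space plus the
    randomness needed to transform new data the same way.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    # D: random sign flip of each feature column.
    signs = rng.choice([-1.0, 1.0], size=p)
    # H: normalized Walsh-Hadamard transform applied to the features.
    Xh = fwht(X * signs) / np.sqrt(p)
    # S: uniform subsampling of p_sub columns, rescaled to keep norms unbiased.
    cols = rng.choice(p, size=p_sub, replace=False)
    Xs = Xh[:, cols] * np.sqrt(p / p_sub)
    # Dual ridge regression: solve an n-by-n system instead of a p-by-p one.
    K = Xs @ Xs.T
    alpha = np.linalg.solve(K + lam * np.eye(n), y)
    w = Xs.T @ alpha  # weights in the sketched feature space
    return w, signs, cols
```

To predict on new data, one would apply the same sign flip, transform, and column subset before taking the inner product with `w`. The dual formulation is what makes this pay off: after sketching, the linear system is only n × n, while the transform itself costs O(np log p) rather than the O(n^2 p) of the standard dual solve.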
