
Efficient Symmetric Norm Regression via Linear Sketching

2019-10-04 · NeurIPS 2019

Zhao Song, Ruosong Wang, Lin F. Yang, Hongyang Zhang, Peilin Zhong


Abstract

We provide efficient algorithms for overconstrained linear regression problems of size n \times d (with n \gg d) when the loss function is a symmetric norm (a norm invariant under sign-flips and coordinate-permutations). An important class of symmetric norms are Orlicz norms, where for a function G and a vector y \in \mathbb{R}^n, the corresponding Orlicz norm \|y\|_G is defined as the unique value \alpha such that \sum_{i=1}^n G(|y_i|/\alpha) = 1. When the loss function is an Orlicz norm, our algorithm produces a (1 + \epsilon)-approximate solution for an arbitrarily small constant \epsilon > 0 in input-sparsity time, improving over the previously best-known algorithm, which produces a d \cdot \mathrm{polylog}\, n-approximate solution. When the loss function is a general symmetric norm, our algorithm produces a d \cdot \mathrm{polylog}\, n \cdot \mathrm{mmc}(\ell)-approximate solution in input-sparsity time, where \mathrm{mmc}(\ell) is a quantity related to the symmetric norm under consideration. To the best of our knowledge, this is the first input-sparsity time algorithm with provable guarantees for the general class of symmetric norm regression problems. Our results shed light on resolving the universal sketching problem for linear regression, and the techniques might be of independent interest to numerical linear algebra problems more broadly.
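The Orlicz norm defined in the abstract can be evaluated numerically: since \sum_{i=1}^n G(|y_i|/\alpha) is decreasing in \alpha for an increasing G with G(0) = 0, the unique root \alpha can be found by bisection. A minimal sketch under those assumptions (the helper `orlicz_norm` is ours, not from the paper):

```python
def orlicz_norm(y, G, tol=1e-10):
    """Compute the Orlicz norm ||y||_G: the unique alpha > 0 with
    sum_i G(|y_i| / alpha) = 1. Assumes G is increasing with G(0) = 0,
    so the sum is strictly decreasing in alpha and bisection applies."""
    if all(v == 0 for v in y):
        return 0.0
    f = lambda a: sum(G(abs(v) / a) for v in y)
    # Bracket the root: f(alpha) is large for small alpha, small for large alpha.
    lo, hi = 1e-12, 1.0
    while f(hi) > 1:
        hi *= 2
    while f(lo) < 1:
        lo /= 2
    # Bisect until the bracket is tight relative to alpha.
    while hi - lo > tol * hi:
        mid = (lo + hi) / 2
        if f(mid) > 1:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Sanity check: with G(t) = t^2 the Orlicz norm is the Euclidean norm,
# since sum_i (|y_i|/alpha)^2 = 1 forces alpha = ||y||_2.
print(orlicz_norm([3.0, 4.0], lambda t: t * t))  # ≈ 5.0
```

With G(t) = t^p one similarly recovers the \ell_p norm, which is a quick way to check any implementation against known values.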
