Subspace Embedding and Linear Regression with Orlicz Norm
Alexandr Andoni, Chengyu Lin, Ying Sheng, Peilin Zhong, Ruiqi Zhong
Abstract
We consider a generalization of the classic linear regression problem to the case when the loss is an Orlicz norm. An Orlicz norm is parameterized by a non-negative convex function $G:\mathbb{R}_+\to\mathbb{R}_+$ with $G(0)=0$: the Orlicz norm of a vector $x\in\mathbb{R}^n$ is defined as $\|x\|_G=\inf\{\alpha>0 \mid \sum_{i=1}^n G(|x_i|/\alpha)\le 1\}$. We consider the cases where the function $G(\cdot)$ grows subquadratically. Our main result is based on a new oblivious embedding which embeds the column space of a given matrix $A\in\mathbb{R}^{n\times d}$ with Orlicz norm into a lower-dimensional space with $\ell_2$ norm. Specifically, we show how to efficiently find an embedding matrix $S\in\mathbb{R}^{m\times n}$, $m\ll n$, such that $\forall x\in\mathbb{R}^d$, $\Omega(1/(d\log n))\cdot\|Ax\|_G \le \|SAx\|_2 \le O(d^2\log n)\cdot\|Ax\|_G$. By applying this subspace embedding technique, we show an approximation algorithm for the regression problem $\min_{x\in\mathbb{R}^d}\|Ax-b\|_G$, up to an $O(d^2\log n)$ factor. As a further application of our techniques, we show how to also use them to improve on the algorithm for the $\ell_p$ low-rank matrix approximation problem for $1\le p<2$.
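The Orlicz norm defined above can be evaluated numerically: since $\alpha \mapsto \sum_i G(|x_i|/\alpha)$ is non-increasing for convex increasing $G$ with $G(0)=0$, the infimum can be bracketed and found by binary search. The sketch below is an illustrative helper (not part of the paper's algorithm); the function name `orlicz_norm` and the choice of test functions $G$ are assumptions for demonstration. Note that for $G(t)=t^p$ the Orlicz norm coincides with the $\ell_p$ norm, which gives an easy sanity check.

```python
def orlicz_norm(x, G, iters=100):
    """Compute ||x||_G = inf{ a > 0 : sum_i G(|x_i|/a) <= 1 } by binary search.

    Assumes G is non-negative, convex, increasing, with G(0) = 0, so that
    phi(a) = sum_i G(|x_i|/a) is non-increasing in a.
    This is an illustrative sketch, not the paper's embedding-based method.
    """
    if max((abs(v) for v in x), default=0.0) == 0.0:
        return 0.0  # the zero vector has norm 0 by convention

    def phi(a):
        return sum(G(abs(v) / a) for v in x)

    # Find an upper bracket hi with phi(hi) <= 1.
    hi = 1.0
    while phi(hi) > 1.0:
        hi *= 2.0
    # Find a lower bracket lo with phi(lo) > 1 (phi blows up as a -> 0+).
    lo = hi
    while phi(lo) <= 1.0:
        lo /= 2.0

    # Binary search for the smallest feasible scale a.
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if phi(mid) <= 1.0:
            hi = mid
        else:
            lo = mid
    return hi
```

With $G(t)=t^2$ this recovers the Euclidean norm, e.g. `orlicz_norm([3.0, 4.0], lambda t: t * t)` is (up to the search tolerance) `5.0`, and with $G(t)=t$ it recovers the $\ell_1$ norm, `7.0`.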