Near Optimal Sketching of Low-Rank Tensor Regression
Jarvis Haupt, Xingguo Li, David P. Woodruff
Abstract
We study the least squares regression problem $\min_{\Theta \in S_{\odot D,R}} \|A\Theta - b\|_2$, where $S_{\odot D,R}$ is the set of $\Theta$ for which $\Theta = \sum_{r=1}^R \theta_1^{(r)} \circ \cdots \circ \theta_D^{(r)}$ for vectors $\theta_d^{(r)} \in \mathbb{R}^{p_d}$ for all $r \in [R]$ and $d \in [D]$, and $\circ$ denotes the outer product of vectors. That is, $\Theta$ is a low-dimensional, low-rank tensor. This is motivated by the fact that the number of parameters in $\Theta$ is only $R \cdot \sum_{d=1}^D p_d$, which is significantly smaller than the $\prod_{d=1}^D p_d$ number of parameters in ordinary least squares regression. We consider the above CP decomposition model of tensors $\Theta$, as well as the Tucker decomposition. For both models we show how to apply data dimensionality reduction techniques based on sparse random projections $\Phi \in \mathbb{R}^{m \times n}$, with $m \ll n$, to reduce the problem to a much smaller problem $\min_{\Theta} \|\Phi A \Theta - \Phi b\|_2$, for which if $\Theta'$ is a near-optimum to the smaller problem, then it is also a near-optimum to the original problem. We obtain significantly smaller dimension and sparsity in $\Phi$ than is possible for ordinary least squares regression, and we also provide a number of numerical simulations supporting our theory.
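As a rough illustration of the sketch-and-solve template the abstract describes (though not the paper's tensor-structured algorithm), the following NumPy snippet applies a CountSketch-style sparse projection $\Phi$ with a single $\pm 1$ nonzero per column to an ordinary least squares problem and solves the reduced $m \times p$ system in place of the $n \times p$ one. The sizes `n`, `p`, `m` are placeholder values chosen for illustration, not constants from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not from the paper): n observations,
# p parameters, sketch dimension m << n.
n, p, m = 10_000, 60, 600

A = rng.standard_normal((n, p))
b = A @ rng.standard_normal((p, 1)) + 0.1 * rng.standard_normal((n, 1))

# CountSketch-style sparse Phi: each column has one nonzero +/-1 entry,
# so Phi @ M can be applied in O(nnz(M)) time without materializing Phi.
bucket = rng.integers(0, m, size=n)     # hash each input row to one of m buckets
sign = rng.choice([-1.0, 1.0], size=n)  # random sign per input row

def sketch(M):
    """Compute Phi @ M implicitly via a scatter-add over hashed rows."""
    SM = np.zeros((m, M.shape[1]))
    np.add.at(SM, bucket, sign[:, None] * M)  # unbuffered accumulation
    return SM

# Sketch-and-solve: a near-optimum of the small problem is a
# near-optimum of the original one.
theta_exact = np.linalg.lstsq(A, b, rcond=None)[0]
theta_sk = np.linalg.lstsq(sketch(A), sketch(b), rcond=None)[0]

print("optimal residual:     ", np.linalg.norm(A @ theta_exact - b))
print("sketched near-optimum:", np.linalg.norm(A @ theta_sk - b))
```

In the paper's setting the same template is applied with $\Theta$ constrained to the low-rank set $S_{\odot D,R}$ (or its Tucker analogue); exploiting that structure is what permits the smaller sketch dimension and sparsity in $\Phi$ than unconstrained least squares would require.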