
Krylov Methods are (nearly) Optimal for Low-Rank Approximation

2023-04-06

Ainesh Bakshi, Shyam Narayanan


Abstract

We consider the problem of rank-1 low-rank approximation (LRA) in the matrix-vector product model under various Schatten norms:

$$\min_{\|u\|_2 = 1} \|A(I - uu^\top)\|_{\mathcal{S}_p},$$

where $\|M\|_{\mathcal{S}_p}$ denotes the $\ell_p$ norm of the singular values of $M$. Given $\varepsilon > 0$, our goal is to output a unit vector $v$ such that

$$\|A(I - vv^\top)\|_{\mathcal{S}_p} \leq (1+\varepsilon) \min_{\|u\|_2 = 1} \|A(I - uu^\top)\|_{\mathcal{S}_p}.$$

Our main result shows that Krylov methods (nearly) achieve the information-theoretically optimal number of matrix-vector products for Spectral ($p=\infty$), Frobenius ($p=2$) and Nuclear ($p=1$) LRA. In particular, for Spectral LRA, we show that any algorithm requires $\Omega(\log(n)/\varepsilon^{1/2})$ matrix-vector products, exactly matching the upper bound obtained by Krylov methods [MM15, BCW22]. Our lower bound addresses Open Question 1 in [Woo14], providing evidence for the lack of progress on algorithms for Spectral LRA, and resolves Open Question 1.2 in [BCW22]. Next, we show that for any fixed constant $p$, i.e. $1 \leq p = O(1)$, there is an upper bound of $O(\log(1/\varepsilon)/\varepsilon^{1/3})$ matrix-vector products, implying that the complexity does not grow as a function of input size. This improves the $O(\log(n/\varepsilon)/\varepsilon^{1/3})$ bound recently obtained in [BCW22], and matches their $\Omega(1/\varepsilon^{1/3})$ lower bound up to a $\log(1/\varepsilon)$ factor.
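To make the setting concrete, here is a minimal sketch of the kind of Krylov method the abstract refers to for rank-1 LRA: starting from a random vector $g$, build the Krylov subspace $\mathrm{span}\{g, (A^\top A)g, \ldots, (A^\top A)^{q-1}g\}$ using only matrix-vector products with $A$ and $A^\top$, then pick the best unit vector in that subspace via Rayleigh-Ritz. The function name and NumPy implementation details are illustrative assumptions, not the paper's code; the paper's analysis concerns how many products $q$ suffice, not this particular implementation.

```python
import numpy as np

def krylov_rank1(A, iters, seed=0):
    """Illustrative Krylov-subspace sketch for rank-1 low-rank approximation.

    Builds an orthonormal basis of the Krylov subspace
    span{g, (A^T A) g, ..., (A^T A)^{iters-1} g} from a random start g,
    using one product with A and one with A^T per iteration, then returns
    the best unit vector in that subspace (Rayleigh-Ritz on A^T A).
    """
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    g = rng.standard_normal(n)
    w = g / np.linalg.norm(g)
    basis = []
    for _ in range(iters):
        basis.append(w)
        w = A.T @ (A @ w)  # one matrix-vector product with A, one with A^T
        # Two passes of Gram-Schmidt against the basis, for numerical stability.
        for _ in range(2):
            for b in basis:
                w = w - (b @ w) * b
        nw = np.linalg.norm(w)
        if nw < 1e-10:  # Krylov subspace has become invariant; stop early
            break
        w = w / nw
    Q = np.stack(basis, axis=1)  # n x q matrix with orthonormal columns
    # Rayleigh-Ritz: top eigenvector of Q^T (A^T A) Q, lifted back to R^n.
    AQ = A @ Q
    _, evecs = np.linalg.eigh(AQ.T @ AQ)
    v = Q @ evecs[:, -1]
    return v / np.linalg.norm(v)
```

The returned unit vector $v$ approximates the top right singular vector of $A$, so $A v v^\top$ is the corresponding rank-1 approximation; the quantity $\|A(I - vv^\top)\|_{\mathcal{S}_p}$ measures its quality in the Schatten-$p$ norm.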
