Matrix Completion in Almost-Verification Time
Jonathan A. Kelner, Jerry Li, Allen Liu, Aaron Sidford, Kevin Tian
Abstract
We give a new framework for solving the fundamental problem of low-rank matrix completion, i.e., approximating a rank-r matrix M ∈ ℝ^{m×n} (where m ≥ n) from random observations. First, we provide an algorithm which completes M on 99% of rows and columns under no further assumptions on M from ≈ mr samples and using ≈ mr^2 time. Then, assuming the row and column spans of M satisfy additional regularity properties, we show how to boost this partial completion guarantee to a full matrix completion algorithm by aggregating solutions to regression problems involving the observations. In the well-studied setting where M has incoherent row and column spans, our algorithms complete M to high precision from mr^{2+o(1)} observations in mr^{3+o(1)} time (omitting logarithmic factors in problem parameters), improving upon the prior state-of-the-art [JN15], which used ≈ mr^5 samples and ≈ mr^7 time. Under an assumption on the row and column spans of M we introduce (which is satisfied by random subspaces with high probability), our sample complexity improves to an almost information-theoretically optimal mr^{1+o(1)}, and our runtime improves to mr^{2+o(1)}. Our runtimes have the appealing property of matching the best known runtime to verify that a rank-r decomposition UV^⊤ agrees with the sampled observations. We also provide robust variants of our algorithms that, given random observations from M + N with ‖N‖_F ≤ Δ, complete M to Frobenius norm distance ≈ r^{1.5}Δ in the same runtimes as the noiseless setting. Prior noisy matrix completion algorithms [CP10] only guaranteed a distance of ≈ √n·Δ.
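To make the setting concrete, the following is a minimal illustrative sketch of matrix completion via alternating least squares, in the spirit of the regression subproblems the abstract alludes to. This is not the paper's algorithm and carries none of its guarantees; all names (`als_complete`, the sizes, the regularization `reg`) are assumptions chosen for the example, which fits M ≈ UV^⊤ by solving per-row and per-column regressions restricted to the observed entries.

```python
import numpy as np

rng = np.random.default_rng(0)

def als_complete(M_obs, mask, r, iters=50, reg=1e-6):
    """Illustrative alternating least squares, NOT the paper's algorithm.

    Fits M ~ U @ V.T by alternating ridge regressions over observed entries.
    """
    m, n = M_obs.shape
    U = rng.standard_normal((m, r))
    V = rng.standard_normal((n, r))
    I_r = reg * np.eye(r)
    for _ in range(iters):
        # Each row of U solves a small least-squares problem over the
        # columns observed in that row.
        for i in range(m):
            idx = mask[i]
            Vi = V[idx]
            U[i] = np.linalg.solve(Vi.T @ Vi + I_r, Vi.T @ M_obs[i, idx])
        # Symmetrically, each row of V regresses on the rows observed
        # in that column.
        for j in range(n):
            idx = mask[:, j]
            Uj = U[idx]
            V[j] = np.linalg.solve(Uj.T @ Uj + I_r, Uj.T @ M_obs[idx, j])
    return U @ V.T

# Ground-truth rank-2 matrix and a ~50% random observation mask.
m, n, r = 30, 20, 2
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
mask = rng.random((m, n)) < 0.5
M_obs = np.where(mask, M, 0.0)

M_hat = als_complete(M_obs, mask, r)
rel_err = np.linalg.norm(M_hat - M) / np.linalg.norm(M)
```

In this toy regime (low rank, half the entries observed), the relative Frobenius error typically becomes small; the paper's contribution is achieving such recovery with near-optimal sample counts and in time close to that of merely verifying a candidate factorization.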