SOTAVerified

Optimal Spectral Recovery of a Planted Vector in a Subspace

2021-05-31

Cheng Mao, Alexander S. Wein


Abstract

Recovering a planted vector v in an n-dimensional random subspace of R^N is a generic task related to many problems in machine learning and statistics, such as dictionary learning, subspace recovery, principal component analysis, and non-Gaussian component analysis. In this work, we study computationally efficient estimation and detection of a planted vector v whose ℓ4 norm differs from that of a Gaussian vector with the same ℓ2 norm. For instance, in the special case where v is a ρN-sparse vector with Bernoulli-Gaussian or Bernoulli-Rademacher entries, our results include the following: (1) We give an improved analysis of a slight variant of the spectral method proposed by Hopkins, Schramm, Shi, and Steurer (2016), showing that it approximately recovers v with high probability in the regime nρ ≪ √N. This condition subsumes the conditions ρ ≲ 1/√n or n ≲ √N required by previous work, up to polylogarithmic factors. We achieve ℓ∞ error bounds for the spectral estimator via a leave-one-out analysis, from which it follows that a simple thresholding procedure exactly recovers v with Bernoulli-Rademacher entries, even in the dense case ρ = 1. (2) We study the associated detection problem and show that in the regime nρ ≫ √N, any spectral method from a large class (and more generally, any low-degree polynomial of the input) fails to detect the planted vector. This matches the condition for recovery and offers evidence that no polynomial-time algorithm can succeed in recovering a Bernoulli-Gaussian vector v when nρ ≫ √N.
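The spectral method referenced in the abstract can be sketched in a few lines of NumPy. This is a toy illustration, not the paper's exact variant: the basis construction, the centering constant n/N, and all parameter values below are our own assumptions, and the statistic used is the HSSS-style reweighted covariance sum_i (||y_i||^2 − n/N) y_i y_i^T over the rows y_i of an orthonormal basis of the subspace.

```python
import numpy as np

rng = np.random.default_rng(0)

def planted_subspace_basis(N, n, rho, rng):
    """Orthonormal basis of an n-dim random subspace of R^N containing a
    Bernoulli-Gaussian vector v with sparsity rho (assumed setup)."""
    v = rng.binomial(1, rho, size=N) * rng.normal(size=N) / np.sqrt(rho)
    A = np.column_stack([v, rng.normal(size=(N, n - 1))])
    Q, _ = np.linalg.qr(A)                     # columns: orthonormal basis
    O = np.linalg.qr(rng.normal(size=(n, n)))[0]
    return Q @ O, v                            # random rotation hides v's slot

def spectral_recover(Y):
    """HSSS-style spectral estimator: top eigenvector of
    sum_i (||y_i||^2 - n/N) y_i y_i^T, lifted back to R^N."""
    N, n = Y.shape
    w = (Y ** 2).sum(axis=1) - n / N           # ||y_i||^2 minus its mean
    M = (Y * w[:, None]).T @ Y                 # reweighted covariance
    vals, vecs = np.linalg.eigh(M)
    u = vecs[:, -1]                            # eigenvector of largest eigenvalue
    return Y @ u

# Easy regime: n*rho = 1 is far below sqrt(N) ~ 45.
Y, v = planted_subspace_basis(N=2000, n=10, rho=0.1, rng=rng)
v_hat = spectral_recover(Y)
corr = abs(v @ v_hat) / (np.linalg.norm(v) * np.linalg.norm(v_hat))
print(f"correlation with planted v: {corr:.3f}")
```

With these (assumed) parameters the estimate correlates strongly with v; pushing nρ past √N should make the correlation collapse, in line with the detection lower bound.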
