Low-Degree Method Fails to Predict Robust Subspace Recovery
He Jia, Aravindan Vijayaraghavan
Abstract
The low-degree polynomial framework has been highly successful in predicting computational-versus-statistical gaps for high-dimensional problems in average-case analysis and machine learning. This success has led to the low-degree conjecture, which posits that the method captures the power and limitations of efficient algorithms for a wide class of high-dimensional statistical problems. We identify a natural and basic hypothesis testing problem in R^n that is polynomial-time solvable, but for which the low-degree polynomial method fails to predict its computational tractability even up to degree k = n^Ω(1). Moreover, the low-degree moments match exactly up to degree k = O(√n / log n). Our problem is a special case of the well-studied robust subspace recovery problem. The low-degree lower bounds suggest that there is no polynomial-time algorithm for this problem. In contrast, we give a simple and robust polynomial-time algorithm that solves the problem (and noisy variants of it) by leveraging anti-concentration properties of the distribution. Our results suggest that the low-degree method and low-degree moments fail to capture algorithms based on anti-concentration, challenging their universality as a predictor of computational barriers.