[2603.02594] Low-Degree Method Fails to Predict Robust Subspace Recovery
Statistics > Machine Learning
arXiv:2603.02594 (stat) [Submitted on 3 Mar 2026]

Title: Low-Degree Method Fails to Predict Robust Subspace Recovery
Authors: He Jia, Aravindan Vijayaraghavan

Abstract: The low-degree polynomial framework has been highly successful in predicting computational-versus-statistical gaps for high-dimensional problems in average-case analysis and machine learning. This success has led to the low-degree conjecture, which posits that the method captures the power and limitations of efficient algorithms for a wide class of high-dimensional statistical problems. We identify a natural and basic hypothesis testing problem in $\mathbb{R}^n$ that is solvable in polynomial time, but for which the low-degree polynomial method fails to predict computational tractability even up to degree $k=n^{\Omega(1)}$. Moreover, the low-degree moments match exactly up to degree $k=O(\sqrt{\log n/\log\log n})$. Our problem is a special case of the well-studied robust subspace recovery problem. The low-degree lower bounds would suggest that no polynomial-time algorithm exists for this problem. In contrast, we give a simple and robust polynomial-time algorithm that solves the problem (and noisy variants of it) by leveraging anti-concentration properties of the distribution. Our results suggest that the low-degree method and low-degree ...
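To give intuition for how anti-concentration can beat a low-degree lower bound in this setting, here is a minimal illustrative sketch (not the paper's algorithm; all parameters and the RANSAC-style recovery routine are assumptions for the demo). A constant fraction of samples is planted inside a hidden hyperplane; Gaussian points are anti-concentrated around any hyperplane, so a candidate normal direction under which many inner products vanish exactly must be the planted one.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5        # ambient dimension (toy size)
m = 400      # number of samples
alpha = 0.5  # fraction of inliers lying in the hidden hyperplane

# hidden unit normal of the planted hyperplane
v = rng.standard_normal(n)
v /= np.linalg.norm(v)

# outliers: full-dimensional Gaussians; inliers: Gaussians projected
# into the hyperplane orthogonal to v
X = rng.standard_normal((m, n))
inlier = rng.random(m) < alpha
X[inlier] -= np.outer(X[inlier] @ v, v)

def recover_normal(X, trials=2000, tol=1e-8, frac=0.25):
    """RANSAC-style sketch: sample n-1 points; if all are inliers, their
    span is the hidden hyperplane and its null vector is the normal."""
    m, n = X.shape
    for _ in range(trials):
        idx = rng.choice(m, size=n - 1, replace=False)
        # null vector of the (n-1) x n sample = candidate normal direction
        _, _, Vt = np.linalg.svd(X[idx])
        w = Vt[-1]
        # anti-concentration: a Gaussian point is almost never exactly
        # orthogonal to w, so many vanishing inner products certify w
        if np.mean(np.abs(X @ w) < tol) > frac:
            return w
    return None

w = recover_normal(X)
print(abs(w @ v))  # alignment with the hidden normal, close to 1
```

Each trial succeeds when all n-1 sampled points are inliers (probability roughly alpha^(n-1)), so a few thousand trials suffice at this toy scale; the verification step is what uses anti-concentration.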