
Global Convergence of Iteratively Reweighted Least Squares for Robust Subspace Recovery

2025-06-25 · Code Available

Gilad Lerman, Kang Li, Tyler Maunu, Teng Zhang


Abstract

Robust subspace estimation is fundamental to many machine learning and data analysis tasks. Iteratively Reweighted Least Squares (IRLS) is an elegant and empirically effective approach to this problem, yet its theoretical properties remain poorly understood. This paper establishes that, under deterministic conditions, a variant of IRLS with dynamic smoothing regularization converges linearly to the underlying subspace from any initialization. We extend these guarantees to affine subspace estimation, a setting that lacks prior recovery theory. Additionally, we illustrate the practical benefits of IRLS through an application to low-dimensional neural network training. Our results provide the first global convergence guarantees for IRLS in robust subspace recovery and, more broadly, for nonconvex IRLS on a Riemannian manifold.
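The abstract gives no pseudocode, but the classical IRLS scheme it refers to is easy to sketch: alternate between weighting each point by the inverse of its (smoothed) distance to the current subspace and recomputing the subspace by weighted PCA, while shrinking the smoothing parameter across iterations. The sketch below is a minimal illustration under those assumptions; the function name `irls_subspace`, the specific weight rule, the geometric schedule `beta`, and all parameter values are illustrative choices, not the authors' exact algorithm.

```python
# Minimal IRLS sketch for robust subspace recovery with dynamic smoothing.
# Illustrative only: weight rule, smoothing schedule, and stopping test are
# assumptions reconstructed from the abstract, not the paper's algorithm.
import numpy as np

def irls_subspace(X, d, n_iter=100, delta0=1.0, beta=0.5, tol=1e-10):
    """Estimate a d-dimensional subspace of R^D robust to outliers.

    X: (n, D) data matrix (rows are points, assumed centered).
    Returns V: (D, d) orthonormal basis of the estimated subspace.
    """
    n, D = X.shape
    # Initialize with ordinary PCA; per the paper's global-convergence
    # claim, the starting point should not matter in theory.
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    V = Vt[:d].T
    delta = delta0
    for _ in range(n_iter):
        # Distance of each point to the current subspace.
        residual = X - (X @ V) @ V.T
        dist = np.linalg.norm(residual, axis=1)
        # IRLS weights for the least-absolute-deviations objective,
        # smoothed by delta to avoid division by near-zero distances.
        w = 1.0 / np.maximum(dist, delta)
        # Weighted PCA step: top-d eigenvectors of sum_i w_i x_i x_i^T.
        C = (X * w[:, None]).T @ X
        _, eigvecs = np.linalg.eigh(C)          # ascending eigenvalues
        V_new = eigvecs[:, -d:]
        # Dynamic smoothing: shrink the regularizer each iteration.
        delta = max(beta * delta, 1e-15)
        # Stop when the subspace projector has effectively converged.
        if np.linalg.norm(V_new @ V_new.T - V @ V.T) < tol:
            V = V_new
            break
        V = V_new
    return V
```

A toy run under the same assumptions: sample inliers from a 2-dimensional subspace of R^10, mix in 30% outliers, and check that the recovered projector matches the ground truth.

```python
rng = np.random.default_rng(0)
U = np.linalg.qr(rng.standard_normal((10, 2)))[0]   # ground-truth basis
inliers = rng.standard_normal((140, 2)) @ U.T
outliers = rng.standard_normal((60, 10))
X = np.vstack([inliers, outliers])
V_hat = irls_subspace(X, d=2)
print(np.linalg.norm(V_hat @ V_hat.T - U @ U.T))    # small => recovered
```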
