
The power of small initialization in noisy low-tubal-rank tensor recovery

2026-03-03

Zhiyu Liu, Haobo Geng, Xudong Wang, Yandong Tang, Zhi Han, Yao Wang


Abstract

We study the problem of recovering a low-tubal-rank tensor X⋆ ∈ ℝ^(n×n×k) from noisy linear measurements under the t-product framework. A widely adopted strategy factorizes the optimization variable as U ∗ U^⊤, where U ∈ ℝ^(n×R×k), and then applies factorized gradient descent (FGD) to the resulting optimization problem. Since the tubal-rank r of the underlying tensor X⋆ is typically unknown, this method often assumes r ≤ R ≤ n, a regime known as over-parameterization. However, when the measurements are corrupted by dense noise (e.g., Gaussian noise), FGD with the commonly used spectral initialization yields a recovery error that grows linearly with the over-estimated tubal-rank R. To address this issue, we show that a small initialization enables FGD to achieve a nearly minimax-optimal recovery error, even when the tubal-rank R is significantly overestimated. Using a four-stage analytic framework, we analyze this phenomenon and establish the sharpest known error bound to date, which is independent of the overestimated tubal-rank R. Furthermore, we provide a theoretical guarantee that an easy-to-use early-stopping strategy achieves the best known result in practice. All of these theoretical findings are validated through simulations and real-data experiments.
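To make the setup concrete, the sketch below shows over-parameterized FGD with a small random initialization under the t-product (t-product and t-transpose computed facewise in the Fourier domain). This is only an illustration of the general scheme the abstract describes, not the paper's algorithm: the measurement operator, step size, initialization scale, and function names here are all placeholder choices, and the demo uses identity measurements for simplicity.

```python
import numpy as np

def t_product(A, B):
    # t-product: FFT along the tube (third) axis, facewise matrix
    # products in the Fourier domain, then inverse FFT.
    Ah = np.fft.fft(A, axis=2)
    Bh = np.fft.fft(B, axis=2)
    Ch = np.einsum('ijk,jlk->ilk', Ah, Bh)
    return np.fft.ifft(Ch, axis=2).real

def t_transpose(A):
    # Transpose under the t-product: transpose each frontal slice
    # and reverse the order of slices 2..k.
    At = np.transpose(A, (1, 0, 2))
    return np.concatenate([At[:, :, :1], At[:, :, :0:-1]], axis=2)

def fgd_small_init(y, A_op, A_adj, n, R, k, alpha=1e-6, eta=0.2, T=500):
    # Over-parameterized FGD on f(U) = 0.5 * ||A(U * U^T) - y||^2,
    # started from a small random factor of scale alpha (illustrative
    # defaults; the paper's step size and scale differ).
    U = alpha * np.random.randn(n, R, k)
    for _ in range(T):
        X = t_product(U, t_transpose(U))
        grad_X = A_adj(A_op(X) - y)              # gradient w.r.t. X
        sym = 0.5 * (grad_X + t_transpose(grad_X))
        U = U - eta * 2.0 * t_product(sym, U)    # chain rule through U * U^T
    return t_product(U, t_transpose(U))
```

Because the FFT diagonalizes the t-product, the iteration decouples into an ordinary over-parameterized matrix factorization for each frequency slice, which is where the small-initialization analysis for matrices carries over to the tubal setting.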
