SOTAVerified

Symmetry, Saddle Points, and Global Optimization Landscape of Nonconvex Matrix Factorization

2016-12-29

Xingguo Li, Junwei Lu, Raman Arora, Jarvis Haupt, Han Liu, Zhaoran Wang, Tuo Zhao

Abstract

We propose a general theory for studying the landscape of nonconvex optimization problems with underlying symmetric structures, for a class of machine learning problems (e.g., low-rank matrix factorization, phase retrieval, and deep linear neural networks). Specifically, we characterize the locations of stationary points and the null space of the Hessian matrices of the objective function via the lens of invariant groups. As a major motivating example, we apply the proposed general theory to characterize the global landscape of the nonconvex low-rank matrix factorization problem. In particular, we illustrate how the rotational symmetry group gives rise to infinitely many nonisolated strict saddle points and equivalent global minima of the objective function. By explicitly identifying all stationary points, we divide the entire parameter space into three regions: (1) the region containing the neighborhoods of all strict saddle points, where the objective has negative curvature; (2) the region containing the neighborhoods of all global minima, where the objective enjoys strong convexity along certain directions; and (3) the complement of the above regions, where the gradient has sufficiently large magnitude. We further extend our result to the matrix sensing problem. Such a global landscape implies strong global convergence guarantees for popular iterative algorithms with arbitrary initial solutions.
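The strict-saddle phenomenon the abstract describes can be seen already in the scalar rank-1 analogue of matrix factorization. The sketch below (an illustrative toy example, not code from the paper) checks that for f(u, v) = ½(uv − 1)², the origin is a stationary point whose Hessian has a negative eigenvalue, i.e., a strict saddle, while the global minima {uv = 1} form a continuum, mirroring the symmetry-induced nonisolated minima discussed above.

```python
import numpy as np

# Toy rank-1 factorization objective: f(u, v) = 0.5 * (u*v - 1)**2.
# Any (u, v) with u*v = 1 is a global minimum (a continuum of minima,
# analogous to the rotational symmetry group in the matrix setting).

def grad(u, v):
    """Gradient of f at (u, v)."""
    r = u * v - 1.0
    return np.array([v * r, u * r])

def hessian(u, v):
    """Hessian of f at (u, v), computed analytically."""
    return np.array([[v**2,        2*u*v - 1.0],
                     [2*u*v - 1.0, u**2       ]])

# The origin is stationary: the gradient vanishes there.
g = grad(0.0, 0.0)
print(g)  # [0. 0.]

# But the Hessian at the origin is [[0, -1], [-1, 0]], with
# eigenvalues -1 and 1: negative curvature => strict saddle.
eigvals = np.linalg.eigvalsh(hessian(0.0, 0.0))
print(eigvals)  # [-1.  1.]
```

The negative eigenvalue is exactly what makes the saddle "strict": a first-order method perturbed along the corresponding eigenvector escapes the saddle, which is the mechanism behind the global convergence guarantees mentioned in the abstract.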
