
Finding Local Minima via Stochastic Nested Variance Reduction

2018-06-22

Dongruo Zhou, Pan Xu, Quanquan Gu


Abstract

We propose two algorithms that can find local minima faster than the state-of-the-art algorithms in both finite-sum and general stochastic nonconvex optimization. At the core of the proposed algorithms is One-epoch-SNVRG^+ using stochastic nested variance reduction (Zhou et al., 2018a), which outperforms the state-of-the-art variance reduction algorithms such as SCSG (Lei et al., 2017). In particular, for finite-sum optimization problems, the proposed SNVRG^+ + Neon2^finite algorithm achieves O(n^{1/2}ε^{-2} + nε_H^{-3} + n^{3/4}ε_H^{-7/2}) gradient complexity to converge to an (ε, ε_H)-second-order stationary point, which outperforms SVRG + Neon2^finite (Allen-Zhu and Li, 2017), the best existing algorithm, in a wide regime. For general stochastic optimization problems, the proposed SNVRG^+ + Neon2^online achieves O(ε^{-3} + ε_H^{-5} + ε^{-2}ε_H^{-3}) gradient complexity, which is better than both SVRG + Neon2^online (Allen-Zhu and Li, 2017) and Natasha2 (Allen-Zhu, 2017) in certain regimes. Furthermore, we explore the acceleration brought by the third-order smoothness of the objective function.
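The variance-reduction idea at the heart of One-epoch-SNVRG^+ can be illustrated with the classic single-reference-point (SVRG-style) gradient estimator: the paper's algorithm nests multiple reference points, which this simplified sketch omits. The least-squares objective, step size, and epoch counts below are illustrative choices, not the paper's settings.

```python
import numpy as np

def run_svrg(A, b, epochs=30, inner=100, eta=0.02, seed=0):
    """Minimize the finite-sum least-squares loss
    f(x) = (1/n) * sum_i 0.5 * (a_i^T x - b_i)^2 using an SVRG-style
    variance-reduced estimator (one reference point per epoch; the
    paper's SNVRG^+ nests several, which this sketch does not do)."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    grad_i = lambda z, i: A[i] * (A[i] @ z - b[i])  # per-sample gradient
    for _ in range(epochs):
        snapshot = x.copy()                          # reference point
        mu = A.T @ (A @ snapshot - b) / n            # full gradient at the reference point
        for _ in range(inner):
            i = rng.integers(n)
            # Variance-reduced stochastic gradient: unbiased, and its
            # variance shrinks as x approaches the reference point.
            g = grad_i(x, i) - grad_i(snapshot, i) + mu
            x -= eta * g
    return x
```

Each epoch pays one full-gradient evaluation at the reference point and then takes cheap stochastic steps whose noise is controlled by the correction term; nested variance reduction refines this by maintaining a hierarchy of reference points updated at different frequencies.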
