
Third-order Smoothness Helps: Even Faster Stochastic Optimization Algorithms for Finding Local Minima

2017-12-18

Yaodong Yu, Pan Xu, Quanquan Gu


Abstract

We propose stochastic optimization algorithms that can find local minima faster than existing algorithms for nonconvex optimization problems, by exploiting the third-order smoothness to escape non-degenerate saddle points more efficiently. More specifically, the proposed algorithm only needs $\tilde{O}(\epsilon^{-10/3})$ stochastic gradient evaluations to converge to an approximate local minimum $\mathbf{x}$, which satisfies $\|\nabla f(\mathbf{x})\|_2 \le \epsilon$ and $\lambda_{\min}(\nabla^2 f(\mathbf{x})) \ge -\sqrt{\epsilon}$, in the general stochastic optimization setting, where $\tilde{O}(\cdot)$ hides polylogarithmic factors and constants. This improves upon the $\tilde{O}(\epsilon^{-7/2})$ gradient complexity achieved by the state-of-the-art stochastic local minima finding algorithms by a factor of $\tilde{O}(\epsilon^{-1/6})$. For nonconvex finite-sum optimization, our algorithm also outperforms the best known algorithms in a certain regime.
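The approximate-local-minimum condition in the abstract can be checked numerically. Below is a minimal sketch (not the paper's algorithm) that tests whether a point satisfies $\|\nabla f(\mathbf{x})\|_2 \le \epsilon$ and $\lambda_{\min}(\nabla^2 f(\mathbf{x})) \ge -\sqrt{\epsilon}$, given a gradient and Hessian supplied by the caller; the example function and tolerance are illustrative assumptions.

```python
import numpy as np

def is_approx_local_min(grad, hess, eps):
    """Check the approximate local-minimum condition from the abstract:
    ||grad f(x)||_2 <= eps  and  lambda_min(hess f(x)) >= -sqrt(eps)."""
    grad_ok = np.linalg.norm(grad) <= eps
    # eigvalsh is appropriate since a Hessian is symmetric.
    curv_ok = np.linalg.eigvalsh(hess).min() >= -np.sqrt(eps)
    return grad_ok and curv_ok

# Illustrative example: f(x) = x0^2 - x1^2 has a non-degenerate saddle
# at the origin, where the gradient vanishes but curvature is negative.
x = np.zeros(2)
grad = np.array([2 * x[0], -2 * x[1]])   # gradient of f at x
hess = np.diag([2.0, -2.0])              # Hessian of f at x
print(is_approx_local_min(grad, hess, eps=1e-2))  # False: curvature test fails
```

The saddle point passes the gradient test but fails the curvature test, which is exactly the case where the paper's third-order-smoothness technique is used to escape.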
