Natasha: Faster Non-Convex Stochastic Optimization Via Strongly Non-Convex Parameter

2017-02-02 · ICML 2017

Zeyuan Allen-Zhu

Abstract

Given a nonconvex function that is an average of n smooth functions, we design stochastic first-order methods to find its approximate stationary points. The convergence of our new methods depends on the smallest (negative) eigenvalue −σ of the Hessian, a parameter that describes how nonconvex the function is. Our methods outperform known results for a range of the parameter σ, and can be used to find approximate local minima. Our result implies an interesting dichotomy: there exists a threshold σ_0 so that the currently fastest methods for σ > σ_0 and for σ < σ_0 have different behaviors: the former scales with n^{2/3} and the latter scales with n^{3/4}.
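To make the problem setup concrete, here is a minimal sketch (not the paper's Natasha algorithm, just plain minibatch SGD) of finding an approximate stationary point of a finite-sum objective f(x) = (1/n) Σ_i f_i(x). The toy components f_i, the value of σ, and all step sizes are assumptions for illustration; subtracting (σ/2)‖x‖² from a convex loss keeps the Hessian's smallest eigenvalue at least −σ, mirroring the role of the paper's nonconvexity parameter.

```python
import numpy as np

# Toy finite-sum objective (assumed for illustration, not from the paper):
#   f_i(x) = 0.5 * (a_i . x - b_i)^2 - (sigma/2) * ||x||^2
# The subtracted quadratic bounds the Hessian's smallest eigenvalue below
# by -sigma, so sigma plays the role of the nonconvexity parameter.
rng = np.random.default_rng(0)
n, d = 200, 5
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)
sigma = 0.1  # assumed nonconvexity parameter for this toy example

def full_grad(x):
    # grad f(x) = (1/n) * A^T (A x - b) - sigma * x
    return A.T @ (A @ x - b) / n - sigma * x

x = 2.0 * np.ones(d)               # start away from a stationary point
g0 = np.linalg.norm(full_grad(x))  # initial full-gradient norm

for t in range(3000):
    i = rng.integers(n, size=10)                          # minibatch indices
    g = A[i].T @ (A[i] @ x - b[i]) / len(i) - sigma * x   # stochastic gradient
    x -= 0.05 / (1 + 0.01 * t) * g                        # decaying step size

g_final = np.linalg.norm(full_grad(x))
print(g0, g_final)  # the full-gradient norm should shrink substantially
```

A point x with small ‖∇f(x)‖ is exactly the "approximate stationary point" the abstract refers to; the paper's contribution is reaching such a point faster than plain SGD by exploiting the bound σ on the negative curvature.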