Faster Gradient-Free Algorithms for Nonsmooth Nonconvex Stochastic Optimization
2023-01-16
Lesi Chen, Jing Xu, Luo Luo
Abstract
We consider the optimization problem of the form $\min_{x \in \mathbb{R}^d} f(x) \triangleq \mathbb{E}_{\xi}[F(x;\xi)]$, where the component $F(x;\xi)$ is $L$-mean-squared Lipschitz but possibly nonconvex and nonsmooth. The recently proposed gradient-free method requires at most $\mathcal{O}(L^4 d^{3/2} \epsilon^{-4} + \Delta L^3 d^{3/2} \delta^{-1} \epsilon^{-4})$ stochastic zeroth-order oracle complexity to find a $(\delta,\epsilon)$-Goldstein stationary point of the objective function, where $\Delta = f(x_0) - \inf_{x \in \mathbb{R}^d} f(x)$ and $x_0$ is the initial point of the algorithm. This paper proposes a more efficient algorithm using stochastic recursive gradient estimators, which improves the complexity to $\mathcal{O}(L^3 d^{3/2} \epsilon^{-3} + \Delta L^2 d^{3/2} \delta^{-1} \epsilon^{-3})$.
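To make the "stochastic zeroth-order oracle" concrete, the following is a minimal sketch (not the paper's algorithm) of the standard two-point gradient estimator used in this line of work: it queries $F(\cdot;\xi)$ at two points $x \pm \delta w$ along a random unit direction $w$ and returns an estimate of the gradient of the $\delta$-smoothed objective. The function names `F` and `zo_gradient_estimate` are illustrative, not from the paper.

```python
import numpy as np

def zo_gradient_estimate(F, x, delta, xi, rng):
    """Two-point zeroth-order gradient estimator (illustrative sketch).

    Uses two stochastic oracle calls F(x + delta*w, xi) and F(x - delta*w, xi)
    along a uniformly random unit direction w to estimate the gradient of the
    delta-randomized smoothing of f(x) = E_xi[F(x; xi)].
    """
    d = x.size
    w = rng.standard_normal(d)
    w /= np.linalg.norm(w)  # uniform direction on the unit sphere
    # Central difference along w, scaled by d/(2*delta) so that the
    # estimator is unbiased for the gradient of the smoothed function
    return (d / (2.0 * delta)) * (F(x + delta * w, xi) - F(x - delta * w, xi)) * w
```

For a linear $F$ the smoothing is exact, so averaging many such estimates recovers the true gradient; in general each call costs only two function evaluations, which is why complexity here is measured in oracle calls rather than gradient evaluations.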