
Stochastic Subgradient Descent Escapes Active Strict Saddles on Weakly Convex Functions

2021-08-04

Pascal Bianchi, Walid Hachem, Sholom Schechtman


Abstract

In non-smooth stochastic optimization, we establish the non-convergence of stochastic subgradient descent (SGD) to the critical points recently called active strict saddles by Davis and Drusvyatskiy. Such points lie on a manifold M where the function f has a direction of second-order negative curvature. Off this manifold, the norm of the Clarke subdifferential of f is bounded below. We require two conditions on f. The first assumption is a Verdier stratification condition, a refinement of the popular Whitney stratification. It allows us to establish a reinforced version of the projection formula of Bolte et al. for Whitney stratifiable functions, a result of independent interest. The second assumption, termed the angle condition, allows us to control the distance of the iterates to M. When f is weakly convex, our assumptions are generic. Consequently, generically in the class of definable weakly convex functions, SGD converges to a local minimizer.
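
The result is theoretical, but the escape phenomenon is easy to observe numerically. Below is a minimal sketch, not taken from the paper: plain stochastic subgradient descent on f(x, y) = |x| - y²/2 + y⁴/4, a weakly convex function whose origin is an active strict saddle in the sense above. The active manifold is M = {x = 0}, along which f has negative curvature at 0, while off M every Clarke subgradient has norm at least 1. The function, step sizes, and noise model are illustrative choices, not the paper's assumptions.

```python
# Illustrative sketch (assumed setup, not from the paper): SGD escaping the
# active strict saddle of f(x, y) = |x| - y**2/2 + y**4/4 at the origin.
import numpy as np

rng = np.random.default_rng(0)

def subgrad(x, y):
    # One valid Clarke subgradient of f(x, y) = |x| - y**2/2 + y**4/4.
    # np.sign(0.0) == 0.0, which lies in the subdifferential [-1, 1] of |.| at 0.
    return np.array([np.sign(x), -y + y**3])

z = np.array([0.0, 0.0])                    # start exactly on the saddle
for k in range(1, 10_001):
    step = 1.0 / k                          # square-summable but not summable steps
    noise = 0.01 * rng.standard_normal(2)   # zero-mean oracle noise
    z = z - step * (subgrad(z[0], z[1]) + noise)

print(z)  # in this run, z approaches a local minimizer (0, +/-1), not the saddle
```

Consistent with the paper's conclusion for definable weakly convex functions, the noise pushes the iterates along the negative-curvature direction of M, and they settle near one of the local minimizers (0, ±1) rather than at the saddle.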
