
Asymptotic Analysis of Conditioned Stochastic Gradient Descent

2020-06-04

Rémi Leluc, François Portier

Abstract

In this paper, we investigate a general class of stochastic gradient descent (SGD) algorithms, called Conditioned SGD, based on a preconditioning of the gradient direction. Using a discrete-time approach with martingale tools, we establish, under mild assumptions, the weak convergence of the rescaled sequence of iterates for a broad class of conditioning matrices, including stochastic first-order and second-order methods. Almost-sure convergence results, which may be of independent interest, are also presented. Interestingly, the asymptotic normality result consists in a stochastic equicontinuity property, so that when the conditioning matrix is an estimate of the inverse Hessian, the algorithm is asymptotically optimal.
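
The conditioned SGD recursion the abstract refers to is, presumably, of the form theta_{k+1} = theta_k - gamma_k * C_k * g_k, where g_k is a stochastic gradient of the objective at theta_k, gamma_k a step size, and C_k the conditioning matrix. The sketch below illustrates this update on a synthetic quadratic objective; the objective, the 1/k step-size schedule, and the use of the exact inverse Hessian as conditioning matrix are illustrative assumptions, not the paper's setup.

```python
import numpy as np

# Minimal sketch of a conditioned SGD step:
#     theta_{k+1} = theta_k - gamma_k * C_k @ g_k
# on f(theta) = 0.5 * theta' H theta (assumed here for illustration only).

rng = np.random.default_rng(0)

d = 5
A = rng.standard_normal((d, d))
H = A @ A.T + d * np.eye(d)   # positive-definite Hessian of the quadratic
theta = rng.standard_normal(d)

def stochastic_gradient(theta):
    """Unbiased gradient estimate: exact gradient plus centered noise."""
    return H @ theta + rng.standard_normal(d)

# Conditioning matrix: here the exact inverse Hessian, the case in which
# the abstract states the algorithm is asymptotically optimal.
C = np.linalg.inv(H)

for k in range(1, 10_001):
    gamma_k = 1.0 / k                    # Robbins-Monro step size
    g_k = stochastic_gradient(theta)
    theta = theta - gamma_k * C @ g_k    # conditioned SGD update

print("||theta|| after 10,000 steps:", np.linalg.norm(theta))  # minimizer is 0
```

Taking C_k to be the identity recovers plain SGD; replacing the exact inverse Hessian with an online estimate of it gives the stochastic second-order setting to which the abstract's optimality statement applies.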
