Error Lower Bounds of Constant Step-size Stochastic Gradient Descent

2019-10-18

Zhiyan Ding, Yiding Chen, Qin Li, Xiaojin Zhu

Abstract

Stochastic Gradient Descent (SGD) plays a central role in modern machine learning. While there is extensive work on providing error upper bounds for SGD, not much is known about SGD error lower bounds. In this paper, we study the convergence of constant step-size SGD. We provide an error lower bound of SGD for potentially non-convex objective functions with Lipschitz gradients. To our knowledge, this is the first analysis of an SGD error lower bound without the strong convexity assumption. We use experiments to illustrate our theoretical results.
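The phenomenon the abstract refers to can be illustrated numerically: with a constant step size, SGD iterates do not converge to the minimizer but hover around it at a noise floor, so the error is bounded below away from zero. Below is a minimal sketch on a toy one-dimensional quadratic with Gaussian gradient noise; the objective, step sizes, and noise level are illustrative choices, not the paper's experimental setup.

```python
import random

def sgd_constant_step(grad, x0, eta, noise_std, n_steps, seed=0):
    """Run SGD with a constant step size eta on noisy gradient estimates."""
    rng = random.Random(seed)
    x = x0
    for _ in range(n_steps):
        g = grad(x) + rng.gauss(0.0, noise_std)  # stochastic gradient oracle
        x = x - eta * g
    return x

# Toy objective f(x) = 0.5 * x^2 with minimizer x* = 0 and gradient f'(x) = x.
grad = lambda x: x

# Average the final error |x - x*| over several independent runs.
# With constant step size the iterates reach a stationary noise floor
# (for this quadratic, variance ~ eta * noise_std^2 / (2 - eta))
# instead of converging to zero; a larger step size gives a larger floor.
err_small_eta = sum(abs(sgd_constant_step(grad, 5.0, 0.01, 1.0, 5000, seed=s))
                    for s in range(50)) / 50
err_large_eta = sum(abs(sgd_constant_step(grad, 5.0, 0.1, 1.0, 5000, seed=s))
                    for s in range(50)) / 50
print(err_small_eta < err_large_eta)  # smaller step size -> smaller error floor
```

This matches the qualitative picture behind a constant step-size lower bound: shrinking the step size lowers the floor but never removes it, so for any fixed step size the expected error cannot be driven to zero.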
