
Does Momentum Help? A Sample Complexity Analysis

2021-10-29

Swetha Ganesh, Rohan Deb, Gugan Thoppe, Amarjit Budhiraja


Abstract

Stochastic Heavy Ball (SHB) and Nesterov's Accelerated Stochastic Gradient (ASG) are popular momentum methods in stochastic optimization. While the benefits of such acceleration ideas in deterministic settings are well understood, their advantages in stochastic optimization are still unclear. In fact, in some specific instances, momentum is known not to help in the sample complexity sense. Our work shows that a similar outcome holds for the whole of quadratic optimization. Specifically, we obtain a lower bound on the sample complexity of SHB and ASG for this family and show that the same bound can be achieved by vanilla SGD. We note that there exist results claiming the superiority of momentum-based methods in quadratic optimization, but these are based on one-sided or flawed analyses.
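To make the methods being compared concrete, here is a minimal sketch of the vanilla SGD and SHB update rules on a one-dimensional quadratic with noisy gradients. All names, step sizes, and the noise model below are illustrative assumptions, not the paper's experimental setup; the ASG update (which evaluates the gradient at an extrapolated point) is omitted for brevity.

```python
import random

def noisy_grad(x, a=2.0, sigma=0.1):
    # Stochastic gradient of f(x) = 0.5 * a * x**2: the true gradient
    # a * x plus zero-mean Gaussian noise (hypothetical noise model).
    return a * x + random.gauss(0.0, sigma)

def sgd(x0, alpha=0.1, steps=200):
    """Vanilla SGD: x_{t+1} = x_t - alpha * g_t."""
    x = x0
    for _ in range(steps):
        x = x - alpha * noisy_grad(x)
    return x

def shb(x0, alpha=0.1, beta=0.5, steps=200):
    """Stochastic Heavy Ball: x_{t+1} = x_t - alpha * g_t + beta * (x_t - x_{t-1})."""
    x_prev = x = x0
    for _ in range(steps):
        x, x_prev = x - alpha * noisy_grad(x) + beta * (x - x_prev), x
    return x

random.seed(0)
print(sgd(5.0), shb(5.0))  # both iterates end up near the minimizer 0
```

Both methods contract toward the minimizer at a rate set by the step size and curvature, while the gradient noise keeps the iterates fluctuating around it; the paper's point is that, measured in samples needed to reach a given accuracy on quadratics, the momentum term buys no asymptotic improvement over plain SGD.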
