
Enhancing Stochastic Gradient Descent: A Unified Framework and Novel Acceleration Methods for Faster Convergence

2024-02-02

Yichuan Deng, Zhao Song, Chiwun Yang


Abstract

Based on SGD, previous works have proposed many algorithms with improved convergence speed and generalization in stochastic optimization, such as SGDm, AdaGrad, Adam, etc. However, their convergence analysis under non-convex conditions is challenging. In this work, we propose a unified framework to address this issue. For any first-order method, we interpret the update direction $g_t$ as the sum of the stochastic subgradient $\nabla f_t(x_t)$ and an additional acceleration term $\frac{2|\langle v_t, \nabla f_t(x_t)\rangle|}{\|v_t\|_2^2} v_t$, so convergence can be discussed by analyzing $\langle v_t, \nabla f_t(x_t)\rangle$. Through our framework, we have discovered two plug-and-play acceleration methods: Reject Accelerating and Random Vector Accelerating, and we theoretically demonstrate that these two methods can directly lead to an improvement in convergence rate.
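To make the decomposition concrete, the following is a minimal NumPy sketch of the update direction described in the abstract. The helper name `accelerated_direction`, the placeholder choice of $v_t$ (a momentum-like vector), and the toy objective are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def accelerated_direction(grad, v, eps=1e-12):
    """Update direction g_t as described in the abstract:
    the stochastic (sub)gradient plus the acceleration term
    2 * |<v_t, grad f_t(x_t)>| / ||v_t||_2^2 * v_t.
    Here v is a method-specific vector (e.g. a momentum buffer);
    the paper instantiates it per algorithm (hypothetical choice here)."""
    inner = np.dot(v, grad)                               # <v_t, grad f_t(x_t)>
    accel = 2.0 * abs(inner) / (np.dot(v, v) + eps) * v   # acceleration term
    return grad + accel

# Toy usage: one accelerated step on f(x) = 0.5 * ||x||_2^2 (gradient is x).
x = np.array([1.0, -2.0])
v = np.array([0.5, 0.5])          # placeholder v_t
g = accelerated_direction(x, v)   # update direction g_t
x = x - 0.1 * g                   # gradient-style step with learning rate 0.1
```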
