Constant Stepsize Local GD for Logistic Regression: Acceleration by Instability
Michael Crawshaw, Blake Woodworth, Mingrui Liu
Abstract
Existing analyses of Local (Stochastic) Gradient Descent for heterogeneous objectives require stepsizes on the order of 1/K, where K is the communication interval, which ensures monotonic decrease of the objective. In contrast, we analyze Local Gradient Descent for logistic regression with separable, heterogeneous data using any stepsize > 0. With R communication rounds and M clients, we show convergence at a rate O(1/(KR)) after an initial unstable phase lasting O(KM) rounds. This improves upon the existing O(1/R) rate for general smooth, convex objectives. Our analysis parallels the single-machine analysis of Wu et al. (2024), in which instability is caused by extremely large stepsizes; in our setting, an additional source of instability is large local updates with heterogeneous objectives.
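The algorithm the abstract describes can be sketched in a few lines: each of M clients runs K local gradient descent steps with a constant stepsize on its own logistic loss, and every R-th of the way through, the server averages the local iterates. The sketch below is a minimal NumPy simulation under assumed toy parameters (M, K, R, stepsize, and a synthetic separable heterogeneous dataset sharing a separator `w_star` are all illustrative choices, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters: M clients, K local steps, R rounds, constant stepsize.
M, K, R, eta = 4, 8, 50, 1.0

# Heterogeneous but jointly separable data: each client draws points around a
# client-specific center, all labeled consistently with a shared separator w_star.
w_star = np.array([1.0, 1.0])
clients = []
for m in range(M):
    X = rng.normal(loc=m - (M - 1) / 2, scale=1.0, size=(20, 2))
    y = np.where(X @ w_star >= 0, 1.0, -1.0)
    clients.append((X, y))

def grad(w, X, y):
    """Gradient of the mean logistic loss (1/n) * sum log(1 + exp(-y_i <x_i, w>))."""
    margins = np.clip(y * (X @ w), -50, 50)  # clip to avoid overflow in exp
    return -(X.T @ (y / (1.0 + np.exp(margins)))) / len(y)

w = np.zeros(2)
for r in range(R):
    local = []
    for X, y in clients:
        v = w.copy()
        for _ in range(K):          # K local GD steps at constant stepsize eta
            v -= eta * grad(v, X, y)
        local.append(v)
    w = np.mean(local, axis=0)      # communication round: average local iterates

# Global loss after R rounds (logaddexp(0, -m) = log(1 + exp(-m)), numerically stable).
loss = np.mean([np.mean(np.logaddexp(0.0, -y * (X @ w))) for X, y in clients])
```

Because the clients' data are heterogeneous, the K local steps drift toward each client's own (divergent) minimizer, which with a large constant stepsize produces the transient instability the paper analyzes before the averaged iterate aligns with the shared separator.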