Provable Reduction in Communication Rounds for Non-Smooth Convex Federated Learning
2025-03-27
Karlo Palenzuela, Ali Dadras, Alp Yurtsever, Tommy Löfstedt
Abstract
Multiple local steps are key to communication-efficient federated learning. However, for general non-smooth convex problems, theoretical guarantees for such algorithms have been lacking without assumptions that bound data heterogeneity. Leveraging projection-efficient optimization methods, we propose FedMLS, a federated learning algorithm with provable improvements from multiple local steps. FedMLS attains an ε-suboptimal solution in O(1/ε) communication rounds, requiring a total of O(1/ε²) stochastic subgradient oracle calls.
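To make the setting concrete, below is a minimal, hedged sketch of the generic "multiple local steps + averaging" pattern the abstract refers to (a FedAvg-style subgradient loop on a non-smooth convex objective). This is an illustration of the problem class only, not the paper's FedMLS algorithm; the data, step size, and loss are invented for the example.

```python
import numpy as np

# Sketch: federated subgradient descent with multiple local steps per
# communication round, on a non-smooth convex objective
# f_i(w) = mean |X_i w - y_i| (absolute-error regression per client).
# This is NOT FedMLS; it only illustrates the algorithmic pattern.

rng = np.random.default_rng(0)
d, n_clients, n_per_client = 5, 4, 20
w_true = rng.normal(size=d)
clients = []
for _ in range(n_clients):
    X = rng.normal(size=(n_per_client, d))
    clients.append((X, X @ w_true))  # noiseless targets for simplicity

def local_loss(w, X, y):
    return np.mean(np.abs(X @ w - y))

def subgrad(w, X, y):
    # A subgradient of the mean absolute error (non-smooth at residual 0).
    return X.T @ np.sign(X @ w - y) / len(y)

def fed_local_subgradient(rounds=50, local_steps=10, lr=0.05):
    w = np.zeros(d)
    for _ in range(rounds):              # each iteration = 1 communication round
        updates = []
        for X, y in clients:
            w_i = w.copy()
            for _ in range(local_steps): # multiple local subgradient steps
                w_i -= lr * subgrad(w_i, X, y)
            updates.append(w_i)
        w = np.mean(updates, axis=0)     # server averages client iterates
    return w

w0 = np.zeros(d)
w_final = fed_local_subgradient()
loss_before = np.mean([local_loss(w0, X, y) for X, y in clients])
loss_after = np.mean([local_loss(w_final, X, y) for X, y in clients])
```

The point of the sketch is the cost split the abstract's rates refer to: communication rounds are counted in the outer loop, while subgradient oracle calls accumulate in the inner loop, so more local steps trade oracle calls for fewer rounds.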