A Proximal Gradient Method With Probabilistic Multi-Gossip Communications for Decentralized Composite Optimization
Luyao Guo, Luqing Wang, Xinli Shi, Jinde Cao
Abstract
Decentralized optimization methods with local updates have recently gained attention for their provable communication acceleration. In these methods, nodes perform several iterations of local computation between communication rounds. However, this acceleration is effective only when the loss function is smooth and the network is sufficiently well connected. In this paper, we propose MG-Skip, a communication-efficient method with probabilistic local updates and multi-gossip communications for decentralized composite (smooth + nonsmooth) optimization, whose stepsize is independent of both the number of local updates and the network topology. Without any additional condition on network connectivity, MG-Skip allows the multi-gossip communications to be skipped in most iterations in the strongly convex setting, while its iteration complexity is O(κ log ε⁻¹) and its communication complexity is only O(√(κ/(1−ρ)) log ε⁻¹), where κ is the condition number of the loss function, ρ reflects the connectivity of the network topology, and ε is the target accuracy. These theoretical results show that MG-Skip achieves the optimal communication complexity and confirm the benefits of local updates in the nonsmooth setting.
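To make the structure described above concrete, the following is a minimal illustrative sketch of a proximal-gradient loop in which multi-gossip communication rounds are triggered only with some probability p, so most iterations are purely local. This is a hypothetical simplification for intuition, not the authors' MG-Skip algorithm: the function names, the choice of an ℓ1 nonsmooth term, and the parameters are assumptions, and the correction/tracking terms the paper would need for exact convergence are omitted.

```python
import numpy as np

def prox_l1(x, t):
    # Soft-thresholding: proximal operator of t * ||x||_1
    # (an example choice of nonsmooth term, not fixed by the paper)
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def probabilistic_multi_gossip(grad, X, W, stepsize, p, T, K, rng):
    """Illustrative loop (hypothetical simplification of MG-Skip).

    X : (n_nodes, dim) array of local iterates, one row per node
    W : doubly stochastic gossip (mixing) matrix of the network
    p : probability that an iteration includes a communication round
    K : number of gossip steps per communication round (multi-gossip)
    """
    for _ in range(T):
        # Every node takes a local proximal-gradient step
        X = prox_l1(X - stepsize * grad(X), stepsize)
        # With probability p, run K gossip communications; otherwise skip
        if rng.random() < p:
            for _ in range(K):
                X = W @ X
    return X
```

On a toy problem where each node minimizes 0.5‖x‖² + ‖x‖₁ (so grad is the identity map and the common minimizer is 0), the iterates contract to zero while communicating in only about a fraction p of the iterations, which is the communication-skipping behavior the abstract describes.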