Asynchronous decentralized accelerated stochastic gradient descent
2018-09-24
Guanghui Lan, Yi Zhou
Abstract
In this work, we introduce an asynchronous decentralized accelerated stochastic gradient descent type of method for decentralized stochastic optimization, considering communication and synchronization as the major bottlenecks. We establish O(1/ε) (resp., O(1/√ε)) communication complexity and O(1/ε²) (resp., O(1/ε)) sampling complexity for solving general convex (resp., strongly convex) problems.
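As background for the complexity statement, decentralized stochastic optimization methods alternate local stochastic gradient steps (which drive the sampling complexity) with gossip communication over the network (which drives the communication complexity). The sketch below shows one step of a generic *synchronous* decentralized SGD baseline, not the paper's asynchronous accelerated scheme; the function name, the mixing matrix, and the quadratic toy objective are all illustrative assumptions.

```python
import numpy as np

def decentralized_sgd_step(x, W, grads, lr):
    """One synchronous decentralized SGD step (generic baseline, illustrative only).

    x:     (n_agents, dim) current iterates, one row per agent
    W:     (n_agents, n_agents) doubly stochastic mixing (gossip) matrix
    grads: (n_agents, dim) stochastic gradients computed locally by each agent
    lr:    step size
    """
    # Each agent averages with its neighbors (one communication round),
    # then takes a local stochastic gradient step (one sampling round).
    return W @ x - lr * grads

# Toy usage: 3 agents, agent i holds f_i(x) = 0.5 * (x - b_i)^2, so the
# network-wide optimum of (1/3) * sum_i f_i is the mean of the b_i.
b = np.array([[1.0], [2.0], [3.0]])
W = np.array([[0.50, 0.25, 0.25],   # doubly stochastic mixing matrix
              [0.25, 0.50, 0.25],   # (complete graph, uniform weights)
              [0.25, 0.25, 0.50]])
x = np.zeros((3, 1))
for _ in range(300):
    # Here the "stochastic" gradient is the exact gradient x - b for clarity.
    x = decentralized_sgd_step(x, W, x - b, lr=0.1)
```

With a constant step size, the agents' average converges to the optimum while the individual iterates settle near consensus with a residual disagreement proportional to the step size; diminishing step sizes (and the acceleration and asynchrony studied in the paper) refine this basic behavior.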