
Linear Convergent Decentralized Optimization with Compression

2020-07-01 · ICLR 2021

Xiaorui Liu, Yao Li, Rongrong Wang, Jiliang Tang, Ming Yan

Abstract

Communication compression has become a key strategy to speed up distributed optimization. However, existing decentralized algorithms with compression mainly focus on compressing DGD-type algorithms, and they are unsatisfactory in terms of convergence rate, stability, and the ability to handle heterogeneous data. Motivated by primal-dual algorithms, this paper proposes the first LinEAr convergent Decentralized algorithm with compression, LEAD. Our theory describes the coupled dynamics of the inexact primal and dual updates as well as the compression error, and we provide the first consensus error bound in such settings without assuming bounded gradients. Experiments on convex problems validate our theoretical analysis, and an empirical study on deep neural networks shows that LEAD is applicable to non-convex problems.
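To make the abstract's notion of communication compression concrete, here is a minimal sketch of a top-k sparsification operator, one common example of the contractive compressors used in this literature. This is an illustration only: the function name `topk_compress` and the choice of top-k are assumptions for the example, not necessarily the operator LEAD uses.

```python
import numpy as np

def topk_compress(x, k):
    """Keep only the k largest-magnitude entries of x, zeroing the rest.

    This is a contractive compression operator: for a d-dimensional x,
    E||C(x) - x||^2 <= (1 - k/d) * ||x||^2, so the per-round compression
    error is bounded relative to the message being compressed.
    """
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]  # indices of the k largest |x_i|
    out[idx] = x[idx]
    return out

# Example: compress a 5-dimensional message down to 2 transmitted entries.
x = np.array([0.1, -2.0, 0.5, 3.0, -0.2])
c = topk_compress(x, 2)  # only -2.0 and 3.0 survive
```

Algorithms built on such operators transmit only the compressed vector (here, 2 of 5 entries) each round, trading communication volume for a controlled compression error that the convergence analysis must absorb.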
