
Variance-Reduced Decentralized Stochastic Optimization with Gradient Tracking -- Part II: GT-SVRG

2019-10-08

Ran Xin, Usman A. Khan, Soummya Kar



Abstract

Decentralized stochastic optimization has recently benefited from gradient tracking methods [DSGT_Pu, DSGT_Xin], which provide efficient solutions for large-scale empirical risk minimization problems. In Part I [GT_SAGA] of this work, we develop GT-SAGA, which is based on a decentralized implementation of SAGA [SAGA] using gradient tracking, and discuss regimes of practical interest where GT-SAGA outperforms existing decentralized approaches in terms of the total number of local gradient computations. In this paper, we describe GT-SVRG, a decentralized gradient-tracking-based implementation of SVRG [SVRG], another well-known variance-reduction technique. We show that the convergence rate of GT-SVRG matches that of GT-SAGA for smooth and strongly convex functions and highlight different trade-offs between the two algorithms in various settings.
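The abstract describes GT-SVRG as combining two ingredients: a gradient-tracking consensus step over a mixing matrix, and an SVRG-style variance-reduced gradient estimator built from periodic full-gradient snapshots. A minimal sketch on a toy decentralized least-squares problem is given below; the ring topology, mixing weights, step size, and inner-loop length are illustrative assumptions, not the paper's recommended parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy decentralized least squares: each of n nodes holds m local samples in R^d.
n, m, d = 4, 20, 3
A = [rng.normal(size=(m, d)) for _ in range(n)]
b = [A[i] @ np.ones(d) + 0.1 * rng.normal(size=m) for i in range(n)]

def sample_grad(i, j, x):
    # Gradient of node i's j-th per-sample squared loss at x.
    a = A[i][j]
    return (a @ x - b[i][j]) * a

def full_grad(i, x):
    # Full local gradient (average over node i's m samples).
    return A[i].T @ (A[i] @ x - b[i]) / m

# Doubly stochastic mixing matrix for a 4-node ring (assumed topology).
W = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])

alpha, T, epochs = 0.02, 200, 10   # hypothetical step size / loop lengths
x = np.zeros((n, d))
v = np.array([full_grad(i, x[i]) for i in range(n)])
y = v.copy()                       # gradient tracker, initialized at local gradients

for _ in range(epochs):
    snap = x.copy()                # per-node snapshot points for this epoch
    g_snap = np.array([full_grad(i, snap[i]) for i in range(n)])
    for _ in range(T):
        x = W @ x - alpha * y      # consensus step along the tracked direction
        v_new = np.empty_like(v)
        for i in range(n):
            j = rng.integers(m)
            # SVRG variance-reduced estimator of the full local gradient
            v_new[i] = sample_grad(i, j, x[i]) - sample_grad(i, j, snap[i]) + g_snap[i]
        y = W @ y + v_new - v      # gradient-tracking recursion
        v = v_new

x_avg = x.mean(axis=0)
print(np.round(x_avg, 3))         # should approach the all-ones generating vector
```

The tracker `y` lets every node descend along an estimate of the network-wide gradient, while the snapshot term drives the stochastic-gradient variance to zero, which is what enables the linear convergence rate discussed in the paper.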
