Distributed Gradient Descent with Coded Partial Gradient Computations
2018-11-22
Emre Ozfatura, Sennur Ulukus, Deniz Gunduz
Abstract
Coded computation techniques provide robustness against straggling servers in distributed computing, but they have several limitations: they increase decoding complexity; they discard the computations carried out by straggling servers; and they are typically designed to recover the full gradient, and thus cannot trade off the accuracy of the gradient against the per-iteration completion time. Here we introduce a hybrid approach, called coded partial gradient computation (CPGC), that benefits from the advantages of both coded and uncoded computation schemes, and reduces both the computation time and decoding complexity.
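To make the straggler problem concrete, the following is a minimal sketch of the general idea behind coded gradient computation: data partitions are replicated across workers so the master can recover the full gradient without waiting for every server. This simple replication-based example is an illustration of straggler-tolerant aggregation in general, not the paper's CPGC scheme; all names and the least-squares setup are assumptions for the sketch.

```python
import numpy as np

# Illustrative setup: a small least-squares problem (hypothetical data).
rng = np.random.default_rng(0)
n_samples, dim = 8, 3
X = rng.normal(size=(n_samples, dim))
y = rng.normal(size=n_samples)
w = rng.normal(size=dim)  # current model iterate

def partial_gradient(idx):
    """Gradient of 0.5*||X w - y||^2 restricted to the rows in idx."""
    Xi, yi = X[idx], y[idx]
    return Xi.T @ (Xi @ w - yi)

# Split the data into 4 partitions and replicate each on 2 workers,
# so any single straggler per partition can be tolerated.
partitions = np.array_split(np.arange(n_samples), 4)
assignments = [(0, 1), (1, 2), (2, 3), (3, 0)]  # worker ids per partition

# Simulate worker 2 straggling: for each partition, the master uses the
# result from the first non-straggling worker that holds it.
stragglers = {2}
full_grad = np.zeros(dim)
for part, workers in zip(partitions, assignments):
    responder = next(wk for wk in workers if wk not in stragglers)
    full_grad += partial_gradient(part)  # computed by `responder`

# The recovered gradient matches the uncoded full gradient exactly.
exact = X.T @ (X @ w - y)
```

A purely uncoded scheme would instead have to wait for worker 2 (or drop its partition, biasing the gradient); the hybrid approach the abstract describes aims to keep the low decoding cost of such simple schemes while also exploiting work completed by slow servers.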