
Communication-Efficient Federated Learning via Predictive Coding

2021-08-02

Kai Yue, Richeng Jin, Chau-Wai Wong, Huaiyu Dai


Abstract

Federated learning enables remote workers to collaboratively train a shared machine learning model while keeping training data local. For wireless mobile devices, communication overhead is a critical bottleneck due to limited power and bandwidth. Prior work has applied data compression tools such as quantization and sparsification to reduce this overhead. In this paper, we propose a predictive coding-based compression scheme for federated learning. The scheme shares prediction functions among all devices and allows each worker to transmit a compressed residual vector derived from the reference. In each communication round, we select the predictor and quantizer based on the rate-distortion cost, and further reduce redundancy with entropy coding. Extensive simulations show that the communication cost can be reduced by up to 99% while achieving learning performance that matches or exceeds that of baseline methods.
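The compression step described in the abstract can be illustrated with a minimal sketch: each worker predicts its update from a shared reference, quantizes the residual, and picks the (predictor, quantizer) pair minimizing a rate-distortion cost, with the entropy of the quantized symbols serving as the rate proxy. The predictor set, the uniform quantizer, and the trade-off weight `lam` below are illustrative assumptions, not the paper's exact design.

```python
import numpy as np

def quantize(x, step):
    """Uniform scalar quantizer shared by server and workers (assumed)."""
    return np.round(x / step) * step

def compress_update(update, history, steps=(0.01, 0.05), lam=0.1):
    """Pick the (predictor, quantizer) pair minimizing a rate-distortion
    proxy, then keep only the quantized residual for transmission."""
    # Candidate predictors: a zero reference vs. the previous round's update.
    predictors = {
        "zero": np.zeros_like(update),
        "previous": history[-1] if history else np.zeros_like(update),
    }
    best = None
    for pname, pred in predictors.items():
        residual = update - pred
        for step in steps:
            q = quantize(residual, step)
            distortion = np.mean((residual - q) ** 2)
            # Empirical entropy of the quantized symbols as a rate proxy
            # (an entropy coder would approach this rate).
            _, counts = np.unique(q, return_counts=True)
            p = counts / counts.sum()
            rate = -(p * np.log2(p)).sum()
            cost = distortion + lam * rate
            if best is None or cost < best[0]:
                best = (cost, pname, step, q)
    _, pname, step, q = best
    # The worker transmits (pname, step, entropy-coded q); the server
    # reconstructs update ≈ predictors[pname] + q.
    return pname, step, q

rng = np.random.default_rng(0)
prev = rng.normal(size=8)
curr = prev + 0.01 * rng.normal(size=8)  # updates correlated across rounds
pname, step, q = compress_update(curr, [prev])
recon = (prev if pname == "previous" else np.zeros_like(curr)) + q
print(pname, step, float(np.max(np.abs(recon - curr))))
```

Because consecutive updates are highly correlated here, the cost search selects the "previous" predictor, whose small residual quantizes to near-zero symbols with low entropy, which is the mechanism behind the large communication savings the abstract reports.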
