Wireless Federated Distillation for Distributed Edge Learning with Heterogeneous Data
2019-07-05
Jin-Hyun Ahn, Osvaldo Simeone, Joonhyuk Kang
Abstract
Cooperative training methods for distributed machine learning typically assume noiseless and ideal communication channels. This work studies some of the opportunities and challenges arising from the presence of wireless communication links. We specifically consider wireless implementations of Federated Learning (FL) and Federated Distillation (FD), as well as of a novel Hybrid Federated Distillation (HFD) scheme. Both digital implementations based on separate source-channel coding and over-the-air computing implementations based on joint source-channel coding are proposed and evaluated over Gaussian multiple-access channels.
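The key contrast between FL and FD lies in what each device uploads to the server: FL aggregates full model weights, whereas FD aggregates per-label summary statistics (e.g., average logits), whose size scales with the number of labels rather than the model size. The toy sketch below illustrates this difference in server-side aggregation; all function names and dimensions are illustrative assumptions, not from the paper, and the wireless channel is omitted entirely.

```python
import numpy as np

def fl_aggregate(weight_list):
    """FL server step (sketch): average the devices' model weight vectors."""
    return np.mean(weight_list, axis=0)

def fd_aggregate(logit_tables):
    """FD server step (sketch): average the devices' per-label mean logit tables."""
    return np.mean(logit_tables, axis=0)

# Hypothetical setup: 2 devices, a model with 4 weights, 3 labels.
rng = np.random.default_rng(0)
weights = [rng.normal(size=4) for _ in range(2)]
logits = [rng.normal(size=(3, 3)) for _ in range(2)]  # row i: mean logits for label i

global_weights = fl_aggregate(weights)  # FL payload scales with model size
global_logits = fd_aggregate(logits)    # FD payload scales with number of labels

print(global_weights.shape)  # (4,)
print(global_logits.shape)   # (3, 3)
```

For realistic models the weight vector has millions of entries while the logit table has only `labels × labels`, which is why FD can sharply reduce uplink communication cost; the paper's over-the-air variants further exploit the fact that the Gaussian multiple-access channel naturally sums (and hence averages) simultaneous analog transmissions.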