
Knowledge Distillation For Wireless Edge Learning

2021-04-03 — Code Available

Ahmed P. Mohamed, Abu Shafin Mohammad Mahdee Jameel, Aly El Gamal

Abstract

In this paper, we propose a framework for predicting frame errors in the collaborative, spectrally congested wireless environments of the DARPA Spectrum Collaboration Challenge (SC2), using a recently collected dataset. We employ distributed deep edge learning that is shared among edge nodes and a central cloud. Using this close-to-practice dataset, we find that widely used federated learning approaches, especially those that are privacy preserving, perform worse than local training across a wide range of settings. We therefore use the synthetic minority oversampling technique (SMOTE) to preserve privacy by avoiding the transfer of local data to the cloud, and employ knowledge distillation to benefit from the cloud's high computing and storage capabilities. The proposed framework achieves better overall performance than both local and federated training approaches, while remaining robust against catastrophic failures as well as challenging channel conditions that result in high frame error rates.
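As a rough illustration of the distillation idea in the abstract (a minimal sketch, not the authors' implementation; the function names, temperature, and mixing weight here are illustrative assumptions), a student model on an edge node can be trained against a weighted sum of (a) the cross-entropy with the cloud teacher's temperature-softened outputs and (b) the standard cross-entropy with the hard frame-error labels, following Hinton et al.'s formulation:

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; higher T yields softer distributions.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Hypothetical KD loss: alpha * soft-target term + (1 - alpha) * hard-label term."""
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 so gradient magnitudes stay comparable across temperatures.
    soft = (p_teacher * (np.log(p_teacher) - np.log(p_student))).sum(axis=-1).mean() * T * T
    # Standard cross-entropy against the hard labels (evaluated at T = 1).
    p_hard = softmax(student_logits, 1.0)
    hard = -np.log(p_hard[np.arange(len(labels)), labels]).mean()
    return alpha * soft + (1 - alpha) * hard
```

When the student's logits exactly match the teacher's, the soft term vanishes and only the hard-label term remains, which is one way to sanity-check such an implementation.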
