Supplementary File: Cooperative Gradient Coding for Semi-Decentralized Federated Learning

2024-03-31

Shudi Weng, Chengxi Li, Ming Xiao, Mikael Skoglund

Abstract

Straggler effects are known to degrade the performance of federated learning (FL). In this paper, we investigate FL over wireless networks in the presence of communication stragglers, where power-constrained clients collaboratively train a global model by iteratively optimizing local objective functions on their local datasets and transmitting local model updates to the central parameter server (PS) over fading channels. To mitigate communication stragglers without dataset sharing or prior network information at the PS, we propose cooperative gradient coding (CoGC) for semi-decentralized FL, which enables exact recovery of the global model at the PS. We then conduct a thorough theoretical analysis of the proposed approach: an outage analysis is provided, followed by a convergence analysis based on the failure probability of global model recovery at the PS. Finally, simulation results demonstrate the superiority of the proposed approach in the presence of stragglers under imbalanced data distributions.
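The core idea behind gradient coding — adding redundancy across clients so the server recovers the exact aggregate even when some clients' messages never arrive — can be illustrated with a toy example. The sketch below uses the classic single-straggler code of Tandon et al. with scalar gradients; the coefficients and variable names are illustrative only and are not taken from the CoGC scheme in this paper.

```python
# Toy gradient-coding sketch: three clients, tolerance to any ONE straggler.
# Each client computes two of the three partial gradients and sends a single
# coded combination; the server decodes the exact sum from any two messages.
# (Illustrative coefficients from the standard Tandon et al. construction,
# not from the CoGC paper itself.)

# Partial gradients held by the three clients (scalars for simplicity).
g1, g2, g3 = 1.0, 2.0, 3.0
exact_sum = g1 + g2 + g3

# Coded messages: each client combines the two partial gradients it computes.
w1 = 0.5 * g1 + g2   # client 1 computes g1 and g2
w2 = g2 - g3         # client 2 computes g2 and g3
w3 = 0.5 * g1 + g3   # client 3 computes g1 and g3

# Decoding: for each possible straggler, a linear combination of the two
# surviving messages reconstructs g1 + g2 + g3 exactly.
recovered = {
    1: w2 + 2 * w3,   # client 1 straggles: combine clients 2 and 3
    2: w1 + w3,       # client 2 straggles: combine clients 1 and 3
    3: 2 * w1 - w2,   # client 3 straggles: combine clients 1 and 2
}

for straggler, value in recovered.items():
    print(f"straggler {straggler}: recovered sum = {value}")
```

Each decoding succeeds exactly (no approximation), which is the "exact global model recovery" property the abstract highlights; CoGC extends this principle to the semi-decentralized wireless setting.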
