
Private Aggregation in Hierarchical Wireless Federated Learning with Partial and Full Collusion

2023-06-25

Maximilian Egger, Christoph Hofmeister, Antonia Wachter-Zeh, Rawad Bitar


Abstract

In federated learning, a federator coordinates the training of a model, e.g., a neural network, on privately owned data held by several participating clients. The model is trained with gradient descent, a popular iterative optimization procedure: every client computes partial gradients on its local data and sends them to the federator, which aggregates the results and updates the model. Privacy of the clients' data is a major concern; it has been shown that observing the partial gradients alone can be enough to reveal the clients' data. Existing literature focuses on private aggregation schemes that tackle this privacy problem in settings where all users are connected to each other and to the federator. In this paper, we consider a hierarchical wireless system architecture in which the clients are connected to base stations, and the base stations are connected to the federator either directly or through relays. We examine settings with and without relays, and derive fundamental limits on the communication cost under information-theoretic privacy for different collusion assumptions. We introduce private aggregation schemes tailored to these settings whose communication costs are within multiplicative factors of the derived bounds.
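To make the private aggregation step concrete, below is a minimal sketch of one standard information-theoretic technique, additive secret sharing over a finite field, laid over a two-layer client/base-station topology like the one described above. It is illustrative only and is not the scheme proposed in the paper; the prime modulus P, the integer quantization of gradients, and the helpers share_gradient and aggregate are assumptions made for this example. The idea: each client splits its quantized gradient into shares that are individually uniformly random, so up to num_stations - 1 colluding base stations learn nothing about any individual gradient, while the federator still recovers the exact sum.

```python
import numpy as np

P = 2**31 - 1  # illustrative prime modulus for the finite field GF(P)

def share_gradient(grad_int, num_stations, rng):
    """Split an integer-quantized gradient into additive shares mod P.

    Any num_stations - 1 shares are jointly uniform, so that many
    colluding base stations learn nothing about the client's gradient.
    """
    shares = [rng.integers(0, P, size=grad_int.shape)
              for _ in range(num_stations - 1)]
    shares.append((grad_int - sum(shares)) % P)  # shares sum to grad_int mod P
    return shares

def aggregate(per_station_sums):
    """Federator adds the per-base-station sums; the random shares cancel,
    leaving only the sum of all clients' gradients mod P."""
    return sum(per_station_sums) % P

# Toy run: 3 clients, 2 base stations, gradients quantized to integers.
rng = np.random.default_rng(seed=0)
num_stations = 2
client_grads = [rng.integers(0, 100, size=4) for _ in range(3)]

# Each client sends one share to every base station.
station_inbox = [[] for _ in range(num_stations)]
for grad in client_grads:
    for s, share in enumerate(share_gradient(grad, num_stations, rng)):
        station_inbox[s].append(share)

# Each base station forwards only the sum of the shares it received.
per_station = [sum(inbox) % P for inbox in station_inbox]

# The federator recovers the aggregate without seeing any individual gradient.
assert np.array_equal(aggregate(per_station), sum(client_grads) % P)
```

Note that each base station forwards a single field element per gradient coordinate regardless of the number of clients it serves, which is why communication cost, rather than computation, is the natural figure of merit in this setting; the paper's schemes additionally account for relays and for partial versus full collusion among the intermediate nodes.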
