Federated Learning with Data-Agnostic Distribution Fusion

2021-09-29 · CVPR 2023

Jian-hui Duan, Wenzhong Li, Sanglu Lu

Abstract

Federated learning has emerged as a promising distributed machine learning paradigm for preserving data privacy. One of its fundamental challenges is that data samples across clients are usually not independent and identically distributed (non-IID), leading to slow convergence and a severe performance drop in the aggregated global model. In this paper, we propose a novel data-agnostic, distribution-fusion-based model aggregation method called FedDAF to optimize federated learning with non-IID local datasets. With FedDAF, the heterogeneous clients' data distributions are represented by a fusion of several virtual components with different parameters and weights. We develop a variational autoencoder (VAE) method to derive the optimal parameters of the fusion distribution from the limited statistical information extracted from local models, and optimize model aggregation by solving a probabilistic maximization problem. Extensive experiments on various federated learning scenarios with real-world datasets show that FedDAF achieves significant performance improvement over the state-of-the-art.
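To make the aggregation idea concrete, here is a minimal sketch of distribution-weighted model aggregation. Standard FedAvg weights each client by its sample count; the abstract describes deriving aggregation weights from a fused distribution instead. The function name `aggregate` and the `fusion_weights` argument are hypothetical stand-ins for the weights FedDAF would derive via its VAE-based fusion, not the paper's actual implementation.

```python
import numpy as np

def aggregate(client_params, fusion_weights):
    """FedAvg-style weighted aggregation of client models.

    `fusion_weights` is a hypothetical placeholder for the per-client
    weights that FedDAF derives from its fused virtual-component
    distribution (the real method solves a probabilistic maximization
    problem; here the weights are simply supplied).
    """
    w = np.asarray(fusion_weights, dtype=float)
    w = w / w.sum()  # normalize to a convex combination
    # Each client's parameters: a dict mapping layer name -> ndarray.
    keys = client_params[0].keys()
    return {k: sum(wi * p[k] for wi, p in zip(w, client_params))
            for k in keys}

# Toy example: two clients, one layer each.
clients = [{"fc": np.array([1.0, 2.0])},
           {"fc": np.array([3.0, 4.0])}]
global_model = aggregate(clients, fusion_weights=[1.0, 3.0])
# fused "fc" = 0.25*[1, 2] + 0.75*[3, 4] = [2.5, 3.5]
```

The key design point is that the weights form a convex combination, so the aggregated model stays in the convex hull of the client models; FedDAF's contribution is choosing those weights from distributional information rather than raw sample counts.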