
FedEM: A Privacy-Preserving Framework for Concurrent Utility Preservation in Federated Learning

2025-03-08

Mingcong Xu, Xiaojin Zhang, Wei Chen, Hai Jin


Abstract

Federated Learning (FL) enables collaborative training of models across distributed clients without sharing local data, addressing privacy concerns in decentralized systems. However, the gradient-sharing process exposes private data to potential leakage, compromising FL's privacy guarantees in real-world applications. To address this issue, we propose Federated Error Minimization (FedEM), a novel algorithm that incorporates controlled perturbations through adaptive noise injection. This mechanism effectively mitigates gradient leakage attacks while maintaining model performance. Experimental results on benchmark datasets demonstrate that FedEM significantly reduces privacy risks and preserves model accuracy, achieving a robust balance between privacy protection and utility preservation.
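The abstract describes injecting controlled, adaptive noise into gradients before they are shared, so that an eavesdropper cannot invert them back to the training data. The paper's actual mechanism is not detailed on this page; the sketch below is only a generic illustration of gradient clipping plus scaled Gaussian noise (in the spirit of DP-SGD-style perturbation), with all function names and the noise schedule chosen as assumptions rather than taken from FedEM:

```python
import numpy as np

def perturb_gradients(grads, clip_norm=1.0, noise_scale=0.1, rng=None):
    """Clip each gradient to `clip_norm` and add Gaussian noise.

    A generic gradient-perturbation sketch, NOT the actual FedEM
    algorithm: FedEM's adaptive noise injection may differ in how
    the noise magnitude is chosen per round or per client.
    """
    rng = rng or np.random.default_rng(0)
    noisy = []
    for g in grads:
        # Scale the gradient down if its L2 norm exceeds the clip bound.
        norm = np.linalg.norm(g)
        g_clipped = g * min(1.0, clip_norm / (norm + 1e-12))
        # Add zero-mean Gaussian noise proportional to the clip bound.
        noise = rng.normal(0.0, noise_scale * clip_norm, size=g.shape)
        noisy.append(g_clipped + noise)
    return noisy

# Example: a client perturbs its gradients before sending them to the server.
grads = [np.ones(4) * 3.0, np.array([0.1, -0.2])]
shared = perturb_gradients(grads)
assert all(s.shape == g.shape for s, g in zip(shared, grads))
```

In this kind of scheme the clip bound limits any single example's influence on the update, and the noise masks what remains; the privacy-utility trade-off the abstract refers to is then governed by how `noise_scale` is tuned against model accuracy.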
