FedEGG: Federated Learning with Explicit Global Guidance

2024-04-18

Kun Zhai, Yifeng Gao, Difan Zou, Guangnan Ye, Siheng Chen, Xingjun Ma, Yu-Gang Jiang

Abstract

Federated Learning (FL) holds great potential for diverse applications owing to its privacy-preserving nature. However, its convergence is often challenged by non-IID data distributions, limiting its effectiveness in real-world deployments. Existing methods address these challenges via optimization-based client constraints, adaptive client selection, or the use of pre-trained models or synthetic data. In this work, we reinterpret all of these approaches as introducing an implicit guiding task that regularizes and steers client learning. Following this insight, we propose introducing an explicit global guiding task into the FL framework to improve convergence and performance. To this end, we present FedEGG, a new FL algorithm that constructs the global guiding task from a well-defined, easy-to-converge learning task built on a public dataset and Large Language Models (LLMs). This approach combines the strengths of federated learning (the original FL task) and centralized learning (the global guiding task). We provide a theoretical analysis of FedEGG's convergence, examining the impact of data heterogeneity between the guiding and FL tasks as well as of the guiding strength, and derive an upper bound on the optimal guiding strength that offers practical guidance for implementation. Empirically, FedEGG outperforms state-of-the-art FL methods under both IID and non-IID settings, and further improves their performance when combined with them.
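The abstract describes the mechanism only at a high level, so the following is a minimal sketch of one FedEGG-style communication round under an assumed design: clients run standard local SGD on their private data, the server averages their parameters (FedAvg-style), takes a centralized step on a public guiding dataset, and blends the guided parameters back into the aggregate with a guiding-strength coefficient. The names (`fedegg_round`, `guiding_loader`, `lam`) and the blending rule are hypothetical illustrations, not the paper's published interface.

```python
# Sketch of one FL round with an explicit global guiding task.
# Assumption: the guiding task is mixed in server-side via a
# guiding-strength coefficient `lam`; this is not taken from the paper.
import copy
import torch
import torch.nn.functional as F

def local_update(model, loader, lr=0.01, epochs=1):
    """Standard client-side SGD on private data (the original FL task)."""
    model = copy.deepcopy(model)
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            F.cross_entropy(model(x), y).backward()
            opt.step()
    return model.state_dict()

def fedegg_round(global_model, client_loaders, guiding_loader, lam=0.1, lr=0.01):
    """One round: FedAvg aggregation, then a centralized step on the
    public guiding task, blended in with strength `lam`."""
    # 1) Clients train locally on their (possibly non-IID) data.
    client_states = [local_update(global_model, dl, lr) for dl in client_loaders]

    # 2) Server averages client parameters (plain FedAvg aggregation).
    avg_state = {
        k: torch.stack([s[k].float() for s in client_states]).mean(0)
        for k in client_states[0]
    }

    # 3) Server takes a centralized pass on the easy-to-converge guiding task.
    guide = copy.deepcopy(global_model)
    guide.load_state_dict(avg_state)
    opt = torch.optim.SGD(guide.parameters(), lr=lr)
    for x, y in guiding_loader:
        opt.zero_grad()
        F.cross_entropy(guide(x), y).backward()
        opt.step()

    # 4) Blend the guided parameters into the aggregate. The paper's analysis
    #    derives an upper bound on how strong this guidance should be; here
    #    `lam` simply interpolates between the two sets of parameters.
    new_state = {
        k: (1 - lam) * avg_state[k] + lam * guide.state_dict()[k].float()
        for k in avg_state
    }
    global_model.load_state_dict(new_state)
    return global_model
```

In this reading, `lam = 0` recovers plain FedAvg and larger `lam` pulls the global model toward the centralized guiding task, which matches the abstract's framing of FedEGG as combining federated and centralized learning, with the guiding strength as the tunable trade-off.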
