AMSFL: Adaptive Multi-Step Federated Learning via Gradient Difference-Based Error Modeling
2025-05-27
Ganglou Xu
Abstract
Federated learning faces critical challenges in balancing communication efficiency and model accuracy. One key difficulty lies in approximating local update errors without incurring high computational costs. In this paper, we propose a lightweight yet effective method called Gradient Difference Approximation (GDA), which leverages first-order information to estimate local error trends without computing the full Hessian matrix. The proposed method forms a key component of the Adaptive Multi-Step Federated Learning (AMSFL) framework and provides a unified error modeling strategy for large-scale multi-step adaptive training environments.
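The core idea of estimating curvature from first-order information can be sketched as follows: the difference of gradients at two nearby points approximates the Hessian-vector product H Δw, since ∇f(w + Δw) − ∇f(w) ≈ H Δw by Taylor expansion. This is a minimal illustration of that principle (the function names and the quadratic test objective are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def quadratic_grad(w, A, b):
    # Gradient of f(w) = 0.5 * w^T A w - b^T w, whose Hessian is A.
    return A @ w - b

def gradient_difference_estimate(grad_fn, w, delta_w):
    """Approximate the curvature term H @ delta_w from two gradient
    evaluations, avoiding explicit computation of the Hessian."""
    return grad_fn(w + delta_w) - grad_fn(w)

rng = np.random.default_rng(0)
A = np.array([[3.0, 1.0], [1.0, 2.0]])   # symmetric positive-definite Hessian
b = np.array([1.0, -1.0])
w = rng.normal(size=2)                   # current model parameters
delta_w = 0.1 * rng.normal(size=2)       # a local update step

approx = gradient_difference_estimate(lambda v: quadratic_grad(v, A, b),
                                      w, delta_w)
exact = A @ delta_w                      # true Hessian-vector product
```

For a quadratic objective the gradient difference recovers H Δw exactly; for general smooth losses it is accurate to second order in ‖Δw‖, which is what makes the estimate cheap enough to run per client in a federated setting.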