Improving Local Training in Federated Learning via Temperature Scaling

2024-01-18

Kichang Lee, Songkuk Kim, JeongGil Ko

Abstract

Federated learning is inherently hampered by data heterogeneity: non-i.i.d. training data across local clients. We propose a novel model training approach for federated learning, FLex&Chill, which exploits the Logit Chilling method. Through extensive evaluations, we demonstrate that, in the presence of the non-i.i.d. data characteristics inherent in federated learning systems, this approach can expedite model convergence and improve inference accuracy. Quantitatively, from our experiments, we observe up to a 6x improvement in global federated learning model convergence time, and up to a 3.37% improvement in inference accuracy.
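The core mechanism behind Logit Chilling, as the title suggests, is temperature scaling of the softmax during local training. A minimal sketch of temperature-scaled softmax follows; the function name and the specific temperature values are illustrative assumptions, not details taken from the paper:

```python
import math

def softmax_with_temperature(logits, temperature):
    """Softmax over logits divided by a temperature.

    temperature < 1 ("chilling") sharpens the output distribution;
    temperature > 1 smooths it toward uniform.
    """
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.1]
p_standard = softmax_with_temperature(logits, 1.0)   # ordinary softmax
p_chilled = softmax_with_temperature(logits, 0.5)    # low-temperature ("chilled") softmax
```

With a temperature below 1, the probability mass concentrates on the largest logit, which in turn produces larger gradients for confident updates during local training.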
