
Fast Decentralized Gradient Tracking for Federated Minimax Optimization with Local Updates

2024-05-07

Chris Junchi Li



Abstract

Federated learning (FL) for minimax optimization has emerged as a powerful paradigm for training models across distributed nodes/clients while preserving data privacy and maintaining model robustness under data heterogeneity. In this work, we study the decentralized implementation of federated minimax optimization by proposing K-GT-Minimax, a novel decentralized minimax optimization algorithm that combines local updates with gradient tracking. Our analysis establishes the algorithm's communication efficiency and convergence rate for nonconvex-strongly-concave (NC-SC) minimax optimization, demonstrating a convergence rate superior to existing methods. K-GT-Minimax's ability to handle data heterogeneity and ensure robustness underscores its significance for advancing federated learning research and applications.
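To make the two ingredients named in the abstract concrete, the following is a minimal NumPy sketch of decentralized gradient tracking applied to a minimax problem. It is an illustrative toy only, not the paper's K-GT-Minimax pseudocode: for simplicity it uses one gradient step per communication round (no multiple local updates), a strongly-convex-strongly-concave quadratic saddle objective rather than an NC-SC loss, and a ring gossip topology; all variable names and the objective are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, eta, rounds = 4, 5, 0.1, 300  # clients, dimension, step size, rounds

# Hypothetical heterogeneous local saddle objectives (not from the paper):
#   f_i(x, y) = 0.5 x'A_i x + x'B_i y - 0.5 ||y||^2
# strongly convex in x / strongly concave in y, so the averaged objective
# has the unique saddle point (0, 0) that we can check against.
A = [(0.5 + (i + 1) / n) * np.eye(d) for i in range(n)]
B = [0.1 * rng.standard_normal((d, d)) for _ in range(n)]

def grad_x(i, x, y):  # local gradient in the min variable
    return A[i] @ x + B[i] @ y

def grad_y(i, x, y):  # local gradient in the max variable
    return B[i].T @ x - y

# Doubly stochastic ring gossip matrix (each client averages with neighbors)
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i + 1) % n] = W[i, (i - 1) % n] = 0.25

x = rng.standard_normal((n, d))
y = rng.standard_normal((n, d))
# Gradient trackers: each client's running estimate of the *global* gradient
sx = np.array([grad_x(i, x[i], y[i]) for i in range(n)])
sy = np.array([grad_y(i, x[i], y[i]) for i in range(n)])

for _ in range(rounds):
    x_new = W @ x - eta * sx  # descend on x along the tracked gradient
    y_new = W @ y + eta * sy  # ascend on y along the tracked gradient
    # Tracker update: gossip-average, then add the local gradient increment
    sx = W @ sx + np.array(
        [grad_x(i, x_new[i], y_new[i]) - grad_x(i, x[i], y[i]) for i in range(n)])
    sy = W @ sy + np.array(
        [grad_y(i, x_new[i], y_new[i]) - grad_y(i, x[i], y[i]) for i in range(n)])
    x, y = x_new, y_new

# All clients should reach consensus near the saddle point (0, 0).
```

The tracker update is what distinguishes gradient tracking from plain decentralized SGDA: each client's correction term converges to the network-wide average gradient, which is what lets such methods tolerate data heterogeneity across clients.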
