An Efficient Stochastic Algorithm for Decentralized Nonconvex-Strongly-Concave Minimax Optimization
2022-12-05
Lesi Chen, Haishan Ye, Luo Luo
- github.com/TrueNobility303/DREAM (official) ★ 6
Abstract
This paper studies stochastic nonconvex-strongly-concave minimax optimization over a multi-agent network. We propose an efficient algorithm, called the Decentralized Recursive gradient descEnt Ascent Method (DREAM), which achieves the best-known theoretical guarantee for finding ε-stationary points. Concretely, it requires O(min(κ³ε⁻³, κ²√N ε⁻²)) stochastic first-order oracle (SFO) calls and O(κ²ε⁻²) communication rounds, where κ is the condition number and N is the total number of individual functions. Our numerical experiments also validate the superiority of DREAM over previous methods.
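To make the problem setting concrete, the sketch below runs plain stochastic gradient descent-ascent (the simple single-agent baseline, NOT the decentralized variance-reduced DREAM algorithm) on a toy objective that is strongly concave in the maximization variable. The objective, step sizes, and noise level are illustrative assumptions, not taken from the paper.

```python
import random

# Toy objective f(x, y) = x*y - 0.5*y**2 (strongly concave in y),
# chosen only to illustrate the descent-ascent template.
def grad_x(x, y):
    return y          # ∂f/∂x

def grad_y(x, y):
    return x - y      # ∂f/∂y

def sgda(x0, y0, lr_x=0.05, lr_y=0.1, steps=2000, noise=0.01, seed=0):
    """Single-agent stochastic gradient descent-ascent.

    Each iteration makes two noisy stochastic first-order oracle (SFO)
    calls, then descends on x (the min variable) and ascends on y (the
    max variable). Hyperparameters here are illustrative.
    """
    rng = random.Random(seed)
    x, y = x0, y0
    for _ in range(steps):
        gx = grad_x(x, y) + noise * rng.gauss(0, 1)  # noisy SFO call
        gy = grad_y(x, y) + noise * rng.gauss(0, 1)  # noisy SFO call
        x -= lr_x * gx   # descent step on the minimization variable
        y += lr_y * gy   # ascent step on the maximization variable
    return x, y

x_star, y_star = sgda(1.0, -1.0)
print(x_star, y_star)  # both iterates drift toward the saddle point (0, 0)
```

DREAM improves on this template by adding recursive (variance-reduced) gradient estimators and gossip-style communication across the network, which is what yields the O(min(κ³ε⁻³, κ²√N ε⁻²)) SFO complexity quoted in the abstract.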