Randomized Block-Coordinate Optimistic Gradient Algorithms for Root-Finding Problems
Quoc Tran-Dinh, Yang Luo
Abstract
In this paper, we develop two new randomized block-coordinate optimistic gradient algorithms to approximate a solution of nonlinear equations in large-scale settings, which are called root-finding problems. Our first algorithm is non-accelerated with constant stepsizes, and achieves an O(1/k) best-iterate convergence rate on E[||Gx^k||^2] when the underlying operator G is Lipschitz continuous and satisfies a weak Minty solution condition, where E[·] denotes the expectation and k is the iteration counter. Our second method is a new accelerated randomized block-coordinate optimistic gradient algorithm. We establish both O(1/k^2) and o(1/k^2) last-iterate convergence rates on both E[||Gx^k||^2] and E[||x^{k+1} - x^k||^2] for this algorithm under the co-coerciveness of G. In addition, we prove that the iterate sequence {x^k} converges to a solution almost surely, and ||Gx^k|| attains an o(1/k) almost sure convergence rate. Then, we apply our methods to a class of large-scale finite-sum inclusions, which covers prominent applications in machine learning, statistical learning, and network optimization, especially in federated learning. We obtain two new federated learning-type algorithms and their convergence rate guarantees for solving this problem class.
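To make the non-accelerated scheme concrete, here is a minimal Python sketch of one plausible reading of a randomized block-coordinate optimistic gradient iteration with a constant stepsize: the classical optimistic update x^{k+1} = x^k - eta*(2*G(x^k) - G(x^{k-1})) applied to a single uniformly sampled coordinate block per iteration. The block partition, stepsize rule, and sampling scheme here are illustrative assumptions; the paper's exact algorithm may differ.

```python
import numpy as np

def rbc_optimistic_gradient(G, x0, eta=0.1, num_blocks=2, iters=2000, seed=0):
    """Sketch: randomized block-coordinate optimistic gradient for G(x) = 0.

    Assumed update (one block per iteration, constant stepsize eta):
        x^{k+1}[b] = x^k[b] - eta * (2*G(x^k)[b] - G(x^{k-1})[b])
    This is an illustrative interpretation, not the paper's exact method.
    """
    rng = np.random.default_rng(seed)
    x = x0.astype(float).copy()
    g_prev = G(x)  # initialize the "previous" operator value at x^0
    blocks = np.array_split(np.arange(x.size), num_blocks)
    for _ in range(iters):
        g = G(x)
        b = blocks[rng.integers(num_blocks)]    # sample one block uniformly
        x[b] -= eta * (2.0 * g[b] - g_prev[b])  # optimistic (extrapolated) step
        g_prev = g
    return x

# Toy root-finding example: G(x) = A x with A positive definite, root x* = 0.
A = np.array([[2.0, 0.5], [0.5, 1.0]])
G = lambda x: A @ x
x_star = rbc_optimistic_gradient(G, np.array([1.0, -1.0]))
```

On this toy monotone problem the iterates drive ||Gx^k|| toward zero; the randomness enters only through which block is updated at each step.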