
Second-Order Min-Max Optimization with Lazy Hessians

2024-10-12

Lesi Chen, Chengchang Liu, Jingzhao Zhang


Abstract

This paper studies second-order methods for convex-concave minimax optimization. Monteiro and Svaiter (2012) proposed a method that solves the problem with the optimal iteration complexity of O(ε^{-2/3}) for finding an ε-saddle point. However, it was unclear whether the computational complexity, O((N + d^2) d ε^{-2/3}), could be improved. Here, following Doikov et al. (2023), we take the cost of one first-order oracle call to be N and the cost of one second-order oracle call to be dN. In this paper, we show that the computational cost can be reduced by reusing the Hessian across iterations. Our methods attain an overall computational complexity of O((N + d^2)(d + d^{2/3} ε^{-2/3})), which improves on previous methods by a factor of d^{1/3}. Furthermore, we generalize our method to strongly-convex-strongly-concave minimax problems and establish a complexity of O((N + d^2)(d + d^{2/3} κ^{2/3})) when the condition number of the problem is κ, enjoying a similar speedup over the state-of-the-art method. Numerical experiments on both real and synthetic datasets also verify the efficiency of our method.
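The core idea of the abstract, reusing one Hessian (here, the Jacobian of the saddle-point gradient operator) for several consecutive Newton-type steps, can be sketched on a toy problem. Everything below is an illustrative assumption rather than the paper's actual algorithm: the test function f(x, y) = x^4/4 + xy - y^4/4, the refresh period m, and the small regularizer are all chosen only to make the lazy-update pattern concrete.

```python
import numpy as np

def F(z):
    # Monotone operator of the toy convex-concave saddle function
    # f(x, y) = x^4/4 + x*y - y^4/4, i.e. F(z) = (df/dx, -df/dy).
    x, y = z
    return np.array([x**3 + y, y**3 - x])

def jacobian(z):
    # Jacobian of F; this is the "second-order oracle" of the sketch.
    x, y = z
    return np.array([[3 * x**2, 1.0],
                     [-1.0, 3 * y**2]])

def lazy_newton(z0, m=5, steps=30, reg=1e-3):
    """Newton-type iteration on F(z) = 0 that refreshes the Jacobian
    only every m iterations ("lazy Hessian"), amortizing its cost.
    The regularizer reg is a hypothetical stabilizer, not from the paper."""
    z = np.asarray(z0, dtype=float)
    J = None
    for k in range(steps):
        if k % m == 0:  # expensive Jacobian evaluation, done lazily
            J = jacobian(z) + reg * np.eye(2)
        # cheap steps in between reuse the cached (factorizable) J
        z = z - np.linalg.solve(J, F(z))
    return z

z_star = lazy_newton([1.0, -1.0])
```

In this sketch the d^3-cost linear solve could reuse one matrix factorization for m steps, which is the source of the amortized savings the paper quantifies; the sketch itself carries no complexity guarantee.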
