
Accelerating Inexact HyperGradient Descent for Bilevel Optimization

2023-06-30

Haikuo Yang, Luo Luo, Chris Junchi Li, Michael I. Jordan

Abstract

We present a method for solving general nonconvex-strongly-convex bilevel optimization problems. Our method, the Restarted Accelerated HyperGradient Descent (RAHGD) method, finds an ε-first-order stationary point of the objective with Õ(κ^3.25 ε^-1.75) oracle complexity, where κ is the condition number of the lower-level objective and ε is the desired accuracy. We also propose a perturbed variant of RAHGD for finding an (ε, O(κ^2.5 √ε))-second-order stationary point within the same order of oracle complexity. Our results achieve the best-known theoretical guarantees for finding stationary points in bilevel optimization and also improve upon the existing upper complexity bound for finding second-order stationary points in nonconvex-strongly-concave minimax optimization problems, setting a new state-of-the-art benchmark. Empirical studies are conducted to validate the theoretical results in this paper.
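To make the setup concrete, below is a minimal sketch of plain inexact hypergradient descent on a toy quadratic bilevel problem. This is an illustration of the hypergradient itself, not the paper's RAHGD (it omits acceleration, restarts, and the perturbation for second-order stationarity); the specific objectives f and g and all step sizes are assumptions chosen for the example. The hypergradient uses the implicit-function formula ∇F(x) = ∇ₓf − ∇ₓᵧg [∇ᵧᵧg]⁻¹ ∇ᵧf, with the lower-level solution y*(x) approximated inexactly by gradient descent.

```python
import numpy as np

# Toy bilevel problem (assumed for illustration):
#   upper level: f(x, y) = 0.5*||x - 1||^2 + 0.5*||y||^2
#   lower level: g(x, y) = 0.5*||y - A x||^2, strongly convex in y,
#                so y*(x) = A x.

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

def grad_f_x(x, y):
    return x - 1.0

def grad_f_y(x, y):
    return y

def inner_solve(x, steps=50, lr=0.5):
    """Inexactly approximate y*(x) = argmin_y g(x, y) by gradient descent."""
    y = np.zeros(2)
    for _ in range(steps):
        y -= lr * (y - A @ x)          # grad_y g(x, y) = y - A x
    return y

def hypergradient(x):
    """Inexact hypergradient grad_x f - grad_xy g @ inv(grad_yy g) @ grad_y f."""
    y = inner_solve(x)
    # For this g: grad_yy g = I and grad_xy g = -A^T.
    v = np.linalg.solve(np.eye(2), grad_f_y(x, y))  # inverse-Hessian-vector product
    return grad_f_x(x, y) - (-A.T) @ v

x = np.zeros(2)
for _ in range(200):
    x -= 0.05 * hypergradient(x)
```

For this quadratic instance the total objective is F(x) = 0.5||x − 1||² + 0.5||Ax||², so the iterates converge to the point where (I + AᵀA)x = 1; the inner loop's inexactness is what the "inexact hypergradient" label refers to, and RAHGD's contribution is accelerating this scheme with the stated κ and ε dependence.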
