An Accelerated Gradient Method for Convex Smooth Simple Bilevel Optimization

2024-02-12

Jincheng Cao, Ruichen Jiang, Erfan Yazdandoost Hamedani, Aryan Mokhtari

Abstract

In this paper, we focus on simple bilevel optimization problems, where we minimize a convex smooth objective function over the optimal solution set of another convex smooth constrained optimization problem. We present a novel bilevel optimization method that locally approximates the solution set of the lower-level problem using a cutting-plane approach and employs an accelerated gradient-based update to reduce the upper-level objective function over the approximated solution set. We measure the performance of our method in terms of suboptimality and infeasibility errors and provide non-asymptotic convergence guarantees for both error criteria. Specifically, when the feasible set is compact, we show that our method requires at most $\mathcal{O}(\max\{1/\sqrt{\epsilon_f},\, 1/\epsilon_g\})$ iterations to find a solution that is $\epsilon_f$-suboptimal and $\epsilon_g$-infeasible. Moreover, under the additional assumption that the lower-level objective satisfies the $r$-th Hölderian error bound, we show that our method achieves an iteration complexity of $\mathcal{O}(\max\{\epsilon_f^{-(2r-1)/(2r)},\, \epsilon_g^{-(2r-1)/(2r)}\})$, which matches the optimal complexity of single-level convex constrained optimization when $r=1$.
