Accelerating Inexact HyperGradient Descent for Bilevel Optimization

We present a method for solving general nonconvex-strongly-convex bilevel optimization problems. Our method -- the \emph{Restarted Accelerated HyperGradient Descent} (\texttt{RAHGD}) method -- finds an $\epsilon$-first-order stationary point of the objective with $\tilde{\mathcal{O}}(\kappa^{3.25}\epsilon^{-1.75})$ oracle complexity, where $\kappa$ is the condition number of the lower-level objective and $\epsilon$ is the desired accuracy. We also propose a perturbed variant of \texttt{RAHGD} for finding an $\big(\epsilon, \mathcal{O}(\kappa^{2.5}\sqrt{\epsilon})\big)$-second-order stationary point within the same order of oracle complexity. Our results achieve the best-known theoretical guarantees for finding stationary points in bilevel optimization and also improve upon the existing upper complexity bound for finding second-order stationary points in nonconvex-strongly-concave minimax optimization problems, setting a new state-of-the-art benchmark. Empirical studies are conducted to validate the theoretical results in this paper.
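For context, a minimal sketch of the problem class and of the hypergradient that such methods descend along; the symbols $f$, $g$, $\Phi$, and $y^*$ are standard labels we introduce here and are not fixed by the abstract:

\begin{align*}
\min_{x \in \mathbb{R}^{d_x}} \; \Phi(x) &:= f\big(x, y^*(x)\big)
\quad \text{s.t.} \quad y^*(x) := \arg\min_{y \in \mathbb{R}^{d_y}} g(x, y),\\
\nabla \Phi(x) &= \nabla_x f\big(x, y^*(x)\big)
- \nabla^2_{xy} g\big(x, y^*(x)\big)\,\big[\nabla^2_{yy} g\big(x, y^*(x)\big)\big]^{-1} \nabla_y f\big(x, y^*(x)\big).
\end{align*}

Here $f$ may be nonconvex in $x$ while $g(x, \cdot)$ is strongly convex, so $y^*(x)$ is unique and the Hessian inverse above exists; an \emph{inexact} hypergradient method approximates $y^*(x)$ and the Hessian-inverse-vector product rather than computing them exactly.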