Accelerating Inexact HyperGradient Descent for Bilevel Optimization

Abstract

We present a method for solving general nonconvex-strongly-convex bilevel optimization problems. Our method -- the \emph{Restarted Accelerated HyperGradient Descent} (\texttt{RAHGD}) method -- finds an $\epsilon$-first-order stationary point of the objective with $\tilde{\mathcal{O}}(\kappa^{3.25}\epsilon^{-1.75})$ oracle complexity, where $\kappa$ is the condition number of the lower-level objective and $\epsilon$ is the desired accuracy. We also propose a perturbed variant of \texttt{RAHGD} for finding an $\big(\epsilon,\mathcal{O}(\kappa^{2.5}\sqrt{\epsilon}\,)\big)$-second-order stationary point within the same order of oracle complexity. Our results achieve the best-known theoretical guarantees for finding stationary points in bilevel optimization and also improve upon the existing upper complexity bound for finding second-order stationary points in nonconvex-strongly-concave minimax optimization problems, setting a new state-of-the-art benchmark. Empirical studies are conducted to validate the theoretical results in this paper.
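
To make the setting concrete, the following is a minimal sketch of the inexact-hypergradient idea for a bilevel problem $\min_x f(x, y^*(x))$ with $y^*(x) = \arg\min_y g(x, y)$ and $g$ strongly convex in $y$. The objectives, step sizes, and iteration counts below are illustrative assumptions, and the sketch uses a plain gradient-descent inner solver with conjugate gradients for the linear system, not the accelerated, restarted scheme that \texttt{RAHGD} actually employs.

```python
import jax
import jax.numpy as jnp
from jax.scipy.sparse.linalg import cg

# Illustrative bilevel instance (not from the paper):
# lower-level g is strongly convex in y; upper-level f is nonconvex in x.
def g(x, y):
    return 0.5 * jnp.sum((y - x) ** 2) + 0.1 * jnp.sum(y ** 2)

def f(x, y):
    return jnp.sum(jnp.sin(x) * y) + 0.5 * jnp.sum(x ** 2)

def inner_solve(x, y, steps=50, lr=0.5):
    """Approximate y*(x) = argmin_y g(x, y) by gradient descent."""
    grad_y = jax.grad(g, argnums=1)
    for _ in range(steps):
        y = y - lr * grad_y(x, y)
    return y

def hypergradient(x, y, cg_steps=20):
    """Inexact hypergradient
        d/dx f(x, y*(x)) = grad_x f - grad_xy^2 g @ [grad_yy^2 g]^{-1} grad_y f,
    evaluated at an approximate y, with the linear system solved inexactly."""
    gx = jax.grad(f, argnums=0)(x, y)
    gy = jax.grad(f, argnums=1)(x, y)
    # Matrix-free Hessian-vector product: v -> grad_yy^2 g(x, y) @ v.
    hvp = lambda v: jax.jvp(
        jax.grad(g, argnums=1), (x, y), (jnp.zeros_like(x), v)
    )[1]
    v, _ = cg(hvp, gy, maxiter=cg_steps)  # v ~= [grad_yy^2 g]^{-1} grad_y f
    # Cross term grad_xy^2 g @ v, computed as d/dx <grad_y g(x, y), v>.
    cross = jax.grad(lambda xx: jnp.vdot(jax.grad(g, argnums=1)(xx, y), v))(x)
    return gx - cross

# A few outer steps of (non-accelerated) inexact hypergradient descent.
x, y = jnp.ones(5), jnp.zeros(5)
for _ in range(10):
    y = inner_solve(x, y)
    x = x - 0.1 * hypergradient(x, y)
```

Both the inner solve and the conjugate-gradient solve are truncated, so the outer step uses only an approximation of the true hypergradient; the paper's analysis concerns how tightly these inexact computations can be accelerated while preserving convergence to stationary points.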
