
On the Convergence of Single-Loop Stochastic Bilevel Optimization with Approximate Implicit Differentiation

Yubo Zhou
Luo Luo
Guang Dai
Haishan Ye
Main: 11 pages
Bibliography: 3 pages
Appendix: 11 pages
Abstract

Stochastic bilevel optimization has emerged as a fundamental framework for meta-learning and hyperparameter optimization. Despite the practical prevalence of single-loop algorithms, which update the lower- and upper-level variables concurrently, their theoretical understanding, particularly in the stochastic regime, remains significantly less developed than that of their multi-loop counterparts. Existing analyses often yield suboptimal convergence rates or obscure the critical dependence on the lower-level condition number $\kappa$, frequently burying it within generic Lipschitz constants. In this paper, we bridge this gap by providing a refined convergence analysis of the Single-loop Stochastic Approximate Implicit Differentiation (SSAID) algorithm. We prove that SSAID reaches an $\epsilon$-stationary point with an oracle complexity of $\mathcal{O}(\kappa^7 \epsilon^{-2})$. Our result is noteworthy in two respects: (i) it matches the optimal $\mathcal{O}(\epsilon^{-2})$ rate of state-of-the-art multi-loop methods (e.g., stocBiO) while retaining the computational efficiency of a single-loop update; and (ii) it provides the first explicit, fine-grained characterization of the $\kappa$-dependence for stochastic AID-based single-loop methods. This work demonstrates that SSAID is not merely a heuristic: it admits a rigorous theoretical foundation with convergence guarantees competitive with mainstream multi-loop frameworks.
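To make the single-loop structure concrete, below is a minimal sketch (not the authors' SSAID implementation) of a single-loop stochastic bilevel update with an AID-style hypergradient, applied to a toy quadratic problem. The problem data, step sizes, noise level, and Neumann truncation length K are illustrative assumptions only; each iteration performs one stochastic gradient step on the lower-level variable and one approximate-hypergradient step on the upper-level variable, with the inverse Hessian-vector product approximated by a truncated Neumann series built from stochastic Hessian-vector products.

```python
# Illustrative sketch of a single-loop stochastic bilevel update with
# approximate implicit differentiation (AID); not the paper's SSAID code.
# Toy problem:  min_x f(x, y*(x))  with  y*(x) = argmin_y g(x, y),
#   g(x, y) = 0.5 * ||y - A x||^2,   f(x, y) = 0.5 * ||y - b||^2.
import numpy as np

rng = np.random.default_rng(0)
d, p = 5, 5                        # dims of upper variable x and lower variable y
A = rng.standard_normal((p, d))    # lower-level coupling matrix (assumed data)
b = rng.standard_normal(p)         # upper-level target (assumed data)
sigma = 0.01                       # std of simulated stochastic-gradient noise

def noisy(v):                      # simulate a stochastic (mini-batch) oracle
    return v + sigma * rng.standard_normal(v.shape)

def grad_y_g(x, y):      return noisy(y - A @ x)   # ∇_y g
def hvp_yy_g(x, y, v):   return noisy(v)           # ∇²_yy g = I, so HVP is v
def cross_hvp_g(x, y, v): return noisy(-A.T @ v)   # ∇²_xy g applied to v
def grad_y_f(x, y):      return noisy(y - b)       # ∇_y f
def grad_x_f(x, y):      return np.zeros(d)        # f has no direct x-dependence

x, y = np.zeros(d), np.zeros(p)
alpha, beta, K = 0.1, 0.05, 5      # lower/upper step sizes, Neumann truncation

for t in range(500):
    # one stochastic gradient step on the lower-level variable
    y = y - alpha * grad_y_g(x, y)

    # approximate v ≈ [∇²_yy g]^{-1} ∇_y f by a truncated Neumann series,
    # using only stochastic Hessian-vector products
    eta = 0.5                      # chosen so that ||I - eta * ∇²_yy g|| < 1 here
    partial = grad_y_f(x, y)
    v = np.zeros(p)
    for _ in range(K):
        v = v + eta * partial
        partial = partial - eta * hvp_yy_g(x, y, partial)

    # approximate hypergradient  ∇_x f - ∇²_xy g [∇²_yy g]^{-1} ∇_y f
    # and one step on the upper-level variable (concurrent, single loop)
    hypergrad = grad_x_f(x, y) - cross_hvp_g(x, y, v)
    x = x - beta * hypergrad

print("x =", x)                    # should approximately solve A x ≈ b
```

In contrast, a multi-loop method (such as stocBiO, cited in the abstract) would run the lower-level loop and the Neumann/CG solve to high accuracy before every upper-level step; the single-loop variant above keeps both K and the number of lower-level steps per iteration fixed at small constants, which is the setting whose $\mathcal{O}(\kappa^7 \epsilon^{-2})$ complexity the paper analyzes.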
