qNBO: quasi-Newton Meets Bilevel Optimization

3 February 2025
Sheng Fang
Yong-Jin Liu
Wei Yao
Chengming Yu
Jin Zhang
Abstract

Bilevel optimization, addressing challenges in hierarchical learning tasks, has gained significant interest in machine learning. In practice, applying the gradient descent method to bilevel optimization encounters computational hurdles, notably the computation of the exact lower-level solution and the inverse Hessian of the lower-level objective. Although these two aspects are inherently connected, existing methods typically handle them separately, by solving the lower-level problem and then a linear system for the inverse Hessian-vector product. In this paper, we introduce a general framework that addresses these computational challenges in a coordinated manner. Specifically, we leverage quasi-Newton algorithms to accelerate the solution of the lower-level problem while efficiently approximating the inverse Hessian-vector product. Furthermore, by exploiting the superlinear convergence properties of BFGS, we establish a non-asymptotic convergence analysis of the BFGS adaptation within our framework. Numerical experiments demonstrate the comparable or superior performance of the proposed algorithms on real-world learning tasks, including hyperparameter optimization, data hyper-cleaning, and few-shot meta-learning.
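To make the core idea concrete, below is a minimal NumPy sketch. It is not the paper's exact qNBO algorithm: the toy quadratic problem, the variable names, and the exact line search are illustrative assumptions. It shows the coordination the abstract describes: while BFGS solves the lower-level problem, the inverse-Hessian approximation it accumulates can be reused to approximate the inverse Hessian-vector product in the hypergradient, so no separate linear system needs to be solved.

import numpy as np

# Toy bilevel problem (names are illustrative, not from the paper):
#   upper level:  F(x) = f(x, y*(x)) = 0.5 * ||y*(x) - c||^2
#   lower level:  y*(x) = argmin_y g(x, y),  g(x, y) = 0.5 * y'Ay - x'y
# For this g, nabla_y g = Ay - x and nabla^2_{xy} g = -I, so the
# hypergradient reduces to dF/dx = A^{-1} (y*(x) - c).

rng = np.random.default_rng(0)
n = 5
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)          # symmetric positive definite Hessian
c = rng.standard_normal(n)

def grad_y_g(x, y):
    return A @ y - x                  # lower-level gradient

def grad_y_f(y):
    return y - c                      # upper-level gradient in y

def bfgs_lower_level(x, y0, iters=50, tol=1e-10):
    """Solve the lower-level problem with BFGS (exact line search on the
    quadratic) and return the approximate minimizer together with the
    accumulated inverse-Hessian approximation H."""
    y, H = y0.copy(), np.eye(n)
    g = grad_y_g(x, y)
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                    # quasi-Newton direction
        t = -(g @ p) / (p @ (A @ p))  # exact line search for the quadratic
        y_new = y + t * p
        g_new = grad_y_g(x, y_new)
        s, d = y_new - y, g_new - g
        sd = s @ d
        if sd > 1e-12:                # curvature condition; standard
            rho = 1.0 / sd            # inverse BFGS update:
            V = np.eye(n) - rho * np.outer(s, d)
            H = V @ H @ V.T + rho * np.outer(s, s)   # H+ = V H V' + rho ss'
        y, g = y_new, g_new
    return y, H

x = rng.standard_normal(n)
y_star, H = bfgs_lower_level(x, np.zeros(n))

# Reuse H as a free approximation of [nabla^2_{yy} g]^{-1}:
ihvp_qn = H @ grad_y_f(y_star)                     # quasi-Newton IHVP
ihvp_exact = np.linalg.solve(A, grad_y_f(y_star))  # ground-truth solve
print("lower-level residual:", np.linalg.norm(grad_y_g(x, y_star)))
print("IHVP error:", np.linalg.norm(ihvp_qn - ihvp_exact))

On a quadratic lower level, BFGS with exact line search terminates in at most n steps with H equal to A^{-1}, so the quasi-Newton IHVP matches the exact solve; the paper's contribution concerns the general setting, where H is only an approximation and the superlinear convergence of BFGS underpins the non-asymptotic analysis.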

View on arXiv: https://arxiv.org/abs/2502.01076
@article{fang2025_2502.01076,
  title={qNBO: quasi-Newton Meets Bilevel Optimization},
  author={Sheng Fang and Yong-Jin Liu and Wei Yao and Chengming Yu and Jin Zhang},
  journal={arXiv preprint arXiv:2502.01076},
  year={2025}
}