A Novel Hessian-Free Bilevel Optimizer via Evolution Strategies

Abstract

Bilevel optimization has emerged as a powerful tool for solving many modern machine learning problems. However, due to the nested structure of bilevel optimization, even gradient-based methods require second-order derivative approximations via Jacobian- and/or Hessian-vector computations, which can be very costly in practice. In this work, we propose a novel Hessian-free bilevel algorithm, which adopts the Evolution Strategies (ES) method to approximate the response Jacobian matrix in the hypergradient of the bilevel problem, and hence fully eliminates all second-order computations. We call our algorithm ESJ (which stands for the ES-based Jacobian method) and further extend it to the stochastic setting as ESJ-S. Theoretically, we show that both ESJ and ESJ-S are guaranteed to converge. Experimentally, we demonstrate that the proposed algorithms outperform baseline bilevel optimizers on various bilevel problems. In particular, in our experiment on few-shot meta-learning with a ResNet-12 network on the miniImageNet dataset, our algorithm outperforms baseline meta-learning algorithms, while other baseline bilevel optimizers fail to solve such meta-learning problems within a comparable time frame.
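The idea sketched in the abstract can be illustrated on a toy bilevel problem. The following is a minimal sketch, not the paper's exact algorithm: the quadratic inner problem, the inner gradient-descent solver, and all function names are illustrative assumptions. It estimates the hypergradient term involving the response Jacobian, (dy*/dx)^T (∂f/∂y), purely from zeroth-order ES perturbations of the inner solution, so no Hessian- or Jacobian-vector products are ever formed.

```python
# Hedged sketch of an ES-style, Hessian-free hypergradient estimate.
# Inner problem (assumed for illustration): y*(x) = argmin_y 0.5*||y - A x||^2,
# so y*(x) = A x in closed form. Outer objective: f(x, y) = 0.5*||x||^2 + 0.5*||y||^2.
import numpy as np

rng = np.random.default_rng(0)
d = 3
A = rng.standard_normal((d, d))

def inner_solution(x, steps=200, lr=0.1):
    """Approximately solve the inner problem by plain gradient descent."""
    y = np.zeros(d)
    for _ in range(steps):
        y -= lr * (y - A @ x)  # gradient of 0.5*||y - A x||^2 w.r.t. y
    return y

def es_hypergradient(x, sigma=1e-3, n_samples=2000):
    """Estimate d/dx f(x, y*(x)) without any second-order computations.

    The response-Jacobian term (dy*/dx)^T v is approximated by the ES
    smoothing estimator E_u[ u * ((y*(x + sigma*u) - y*(x)) . v) / sigma ].
    """
    y_star = inner_solution(x)
    v = y_star          # ∂f/∂y evaluated at (x, y*(x))
    direct = x          # ∂f/∂x evaluated at (x, y*(x))
    est = np.zeros(d)
    for _ in range(n_samples):
        u = rng.standard_normal(d)
        dy = inner_solution(x + sigma * u) - y_star
        est += u * (dy @ v) / sigma
    return direct + est / n_samples

x = rng.standard_normal(d)
analytic = x + A.T @ (A @ x)   # exact hypergradient, since y*(x) = A x
approx = es_hypergradient(x)
```

Because the toy inner problem is quadratic, the exact hypergradient x + A^T A x is available, so the Monte Carlo estimate can be checked against it; the estimator's error shrinks as the number of perturbation samples grows.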
