
Recursive Bound-Constrained AdaGrad with Applications to Multilevel and Domain Decomposition Minimization

Serge Gratton
Alena Kopaničáková
Philippe Toint
Main: 29 pages, 7 figures, 5 tables; Bibliography: 4 pages
Abstract

Two OFFO (Objective-Function Free Optimization) noise-tolerant algorithms are presented that handle bound constraints and inexact gradients, and use second-order information when available. The first is a multilevel method exploiting a hierarchical description of the problem, and the second is a domain-decomposition method covering the standard additive Schwarz decompositions. Both are generalizations of the first-order AdaGrad algorithm for unconstrained optimization. Because these algorithms share a common theoretical framework, a single convergence/complexity theory is provided which covers them both. Its main result is that, with high probability, both methods need at most $O(\epsilon^{-2})$ iterations and noisy gradient evaluations to compute an $\epsilon$-approximate first-order critical point of the bound-constrained problem. Extensive numerical experiments are discussed on applications ranging from PDE-based problems to deep neural network training, illustrating the methods' remarkable computational efficiency.
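
For readers unfamiliar with the building block the paper generalizes, the sketch below shows a plain projected AdaGrad iteration for box constraints. It is only an illustrative assumption of the basic first-order scheme: the function names, step-size constant, and stopping rule are hypothetical, and the paper's recursive multilevel and domain-decomposition structure, inexact gradients, and second-order information are not modelled.

```python
import numpy as np

def projected_adagrad(grad, x0, lower, upper, eta=1.0, eps=1e-8, max_iter=1000):
    """Minimal sketch of bound-constrained (projected) AdaGrad.

    grad         : callable returning a (possibly noisy) gradient at x
    x0           : starting point
    lower, upper : componentwise bounds defining the feasible box
    eta, eps     : illustrative step-size constant and safeguard
    """
    x = np.clip(np.asarray(x0, dtype=float), lower, upper)
    g_accum = np.zeros_like(x)                     # running sum of squared gradients
    for _ in range(max_iter):
        g = grad(x)                                # noisy gradient oracle
        g_accum += g * g
        step = eta * g / (np.sqrt(g_accum) + eps)  # per-coordinate AdaGrad scaling
        x = np.clip(x - step, lower, upper)        # project back onto the bounds
    return x

# Illustrative use: minimize a quadratic over the box [0, 1]^2.
if __name__ == "__main__":
    target = np.array([2.0, -1.0])
    sol = projected_adagrad(lambda x: 2.0 * (x - target),
                            x0=np.array([0.5, 0.5]),
                            lower=0.0, upper=1.0)
    print(sol)  # expected to approach the projection of `target`, i.e. [1, 0]
```

The paper replaces this single-level loop with recursive calls across a hierarchy of problem descriptions (multilevel variant) or across overlapping subdomains (additive Schwarz variant), while keeping the objective-function-free, AdaGrad-style step control.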
