
The First Optimal Acceleration of High-Order Methods in Smooth Convex Optimization

Dmitry Kovalev
Alexander Gasnikov
Abstract

In this paper, we study the fundamental open question of finding the optimal high-order algorithm for solving smooth convex minimization problems. Arjevani et al. (2019) established the lower bound $\Omega\left(\epsilon^{-2/(3p+1)}\right)$ on the number of $p$-th order oracle calls required by an algorithm to find an $\epsilon$-accurate solution to the problem, where the $p$-th order oracle stands for the computation of the objective function value and its derivatives up to order $p$. However, the existing state-of-the-art high-order methods of Gasnikov et al. (2019b), Bubeck et al. (2019), and Jiang et al. (2019) achieve the oracle complexity $\mathcal{O}\left(\epsilon^{-2/(3p+1)} \log(1/\epsilon)\right)$, which does not match the lower bound. The reason is that these algorithms require performing a complex binary search procedure, which makes them neither optimal nor practical. We fix this fundamental issue by providing the first algorithm with $\mathcal{O}\left(\epsilon^{-2/(3p+1)}\right)$ $p$-th order oracle complexity.
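To make the rates concrete, here is a minimal Python sketch (not part of the paper; the function names are hypothetical and constants and problem-dependent factors are deliberately ignored) that tabulates the exponent $2/(3p+1)$ and compares the lower bound $\epsilon^{-2/(3p+1)}$ with the $\epsilon^{-2/(3p+1)}\log(1/\epsilon)$ complexity of the earlier binary-search-based methods for small $p$.

```python
import math

def lower_bound_calls(eps: float, p: int) -> float:
    """Lower bound Omega(eps^{-2/(3p+1)}) on p-th order oracle calls (constants ignored)."""
    return eps ** (-2.0 / (3 * p + 1))

def prior_method_calls(eps: float, p: int) -> float:
    """Prior accelerated high-order methods: O(eps^{-2/(3p+1)} * log(1/eps)),
    where the extra log factor comes from the binary-search subroutine (constants ignored)."""
    return lower_bound_calls(eps, p) * math.log(1.0 / eps)

if __name__ == "__main__":
    eps = 1e-6
    for p in (1, 2, 3):
        print(f"p={p}: exponent 2/(3p+1) = {2 / (3 * p + 1):.3f}, "
              f"lower bound ~ {lower_bound_calls(eps, p):.1e}, "
              f"prior methods ~ {prior_method_calls(eps, p):.1e}")
```

For $\epsilon = 10^{-6}$ the multiplicative gap between the two estimates is roughly $\log(1/\epsilon) \approx 13.8$, which is exactly the factor the proposed algorithm removes.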
