Non-Euclidean High-Order Smooth Convex Optimization
We develop algorithms for the optimization of convex objectives that have Hölder continuous $q$-th derivatives by using a $q$-th order oracle, for any $q \geq 1$. Our algorithms work for general norms under mild conditions, including the $\ell_p$-settings for $p \geq 1$. We can also optimize structured functions that admit an inexact implementation of a non-Euclidean ball optimization oracle. We do this by developing a non-Euclidean inexact accelerated proximal point method that makes use of an \emph{inexact uniformly convex regularizer}. We prove a lower bound for general norms demonstrating that our algorithms are nearly optimal in high dimensions in the black-box oracle model for $\ell_p$-settings and all $q \geq 1$, even in randomized and parallel settings. This new lower bound, when applied to the first-order smooth case, resolves an open question in parallel convex optimization.
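To give a concrete feel for the inexact proximal point idea, the sketch below shows a minimal, non-accelerated variant in the simplest possible setting: a Euclidean squared-norm regularizer (the $q = 2$ special case of a uniformly convex regularizer) and an inner gradient-descent loop standing in for the inexact subproblem oracle. All names (`inexact_prox_step`, `inexact_prox_point`), step sizes, and iteration counts are illustrative assumptions, not the paper's algorithm, which is accelerated and handles general non-Euclidean norms.

```python
import numpy as np

def inexact_prox_step(grad_f, y, lam=1.0, inner_steps=200, lr=0.1):
    """Approximately solve min_x f(x) + lam * ||x - y||_2^2.

    Plain gradient descent on the regularized subproblem plays the
    role of the inexact oracle: it returns only an approximate
    minimizer, which is all the outer method requires.
    """
    x = y.copy()
    for _ in range(inner_steps):
        # Gradient of the subproblem: grad f(x) + 2 * lam * (x - y).
        x = x - lr * (grad_f(x) + 2.0 * lam * (x - y))
    return x

def inexact_prox_point(grad_f, x0, lam=1.0, outer_steps=50):
    """Basic (non-accelerated) inexact proximal point method:
    each outer iterate is an approximate proximal step at the
    previous iterate."""
    x = x0.copy()
    for _ in range(outer_steps):
        x = inexact_prox_step(grad_f, x, lam=lam)
    return x

# Toy problem: f(x) = 0.5 * ||x - a||^2, minimized at x = a.
a = np.array([1.0, -2.0, 3.0])
x_star = inexact_prox_point(lambda x: x - a, np.zeros(3))
```

On this strongly convex toy problem the outer iterates contract geometrically toward the minimizer even though every subproblem is solved only approximately; the paper's contribution is obtaining accelerated rates of this flavor under general norms and a $q$-uniformly convex (rather than squared Euclidean) regularizer.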