Recent advances (Sherman, 2017; Sidford and Tian, 2018; Cohen et al., 2021) have overcome the fundamental barrier of dimension dependence in the iteration complexity of solving $\ell_\infty$ regression with first-order methods. Yet it remains unclear to what extent such acceleration can be achieved for general smooth functions. In this paper, we propose a new accelerated first-order method for convex optimization under non-Euclidean smoothness assumptions. In contrast to standard acceleration techniques, our approach uses primal-dual iterate sequences taken with respect to differing norms, which are then coupled using an implicitly determined interpolation parameter. For $\ell_p$ norm smooth problems in $d$ dimensions, our method provides an iteration complexity improvement of up to $O(d^{1-\frac{2}{p}})$ in terms of calls to a first-order oracle, thereby allowing us to circumvent long-standing barriers in accelerated non-Euclidean steepest descent.
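As background for the $d^{1-\frac{2}{p}}$ factor above, the following standard norm-comparison calculation sketches where such dimension dependence typically enters (a minimal sketch assuming $p \ge 2$, not the paper's own analysis):

% Standard norm equivalence in R^d for p >= 2 (background fact):
\[
  \|x\|_p \;\le\; \|x\|_2 \;\le\; d^{\frac{1}{2}-\frac{1}{p}}\,\|x\|_p
  \qquad \text{for all } x \in \mathbb{R}^d .
\]
% Squaring the upper bound gives the dimension factor in question:
\[
  \|x\|_2^2 \;\le\; d^{\,1-\frac{2}{p}}\,\|x\|_p^2 .
\]

Consequently, a guarantee stated in terms of squared Euclidean distances can be loose by up to a factor of $d^{1-\frac{2}{p}}$ when the problem's natural geometry is the $\ell_p$ norm, which matches the scale of the improvement claimed in the abstract.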
@article{bai2025_2409.19200,
  title={Faster Acceleration for Steepest Descent},
  author={Site Bai and Brian Bullins},
  journal={arXiv preprint arXiv:2409.19200},
  year={2025}
}