In this paper, we consider an unconstrained stochastic optimization problem where the objective function exhibits high-order smoothness. Specifically, we propose a new stochastic first-order method (SFOM) with multi-extrapolated momentum, in which multiple extrapolations are performed in each iteration, followed by a momentum update based on these extrapolations. We demonstrate that the proposed SFOM can accelerate optimization by exploiting the high-order smoothness of the objective function $f$. Assuming that the $p$th-order derivative of $f$ is Lipschitz continuous for some $p \ge 2$, and under additional mild assumptions, we establish that our method achieves a sample complexity of $\widetilde{\mathcal{O}}(\epsilon^{-(3p+1)/p})$ for finding a point $x$ such that $\mathbb{E}[\|\nabla f(x)\|] \le \epsilon$. To the best of our knowledge, this is the first SFOM to leverage arbitrary-order smoothness of the objective function for acceleration, resulting in a sample complexity that improves upon the best-known results without assuming the mean-squared smoothness condition. Preliminary numerical experiments validate the practical performance of our method and support our theoretical findings.
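To make the high-level description concrete, the following is a minimal, purely illustrative sketch of the kind of scheme the abstract describes: at each iteration, stochastic gradients are evaluated at several extrapolated points built from the last displacement, averaged, and folded into a momentum buffer that drives a normalized step. This is not the paper's actual algorithm; the extrapolation coefficients `thetas`, the step size `eta`, the momentum weight `beta`, and the toy quadratic objective are all hypothetical choices made for this demo.

```python
# Illustrative sketch (hypothetical parameters, NOT the paper's exact method):
# a stochastic first-order method with "multi-extrapolated momentum".
import random

def grad_oracle(x, noise=0.01, rng=random):
    # Stochastic gradient of the toy objective f(x) = 0.5*(x1^2 + 4*x2^2),
    # corrupted by additive Gaussian noise (unbiased oracle).
    return [x[0] + noise * rng.gauss(0.0, 1.0),
            4.0 * x[1] + noise * rng.gauss(0.0, 1.0)]

def sfom_multi_extrap(x0, steps=200, eta=0.05, beta=0.2,
                      thetas=(0.5, 1.0), seed=0):
    rng = random.Random(seed)
    x, x_prev = list(x0), list(x0)
    m = [0.0] * len(x0)          # momentum buffer
    for _ in range(steps):
        d = [x[i] - x_prev[i] for i in range(len(x))]  # last displacement
        # Multiple extrapolations: average stochastic gradients taken at
        # several extrapolated points x + theta * (x - x_prev).
        g = [0.0] * len(x)
        for th in thetas:
            ge = grad_oracle([x[i] + th * d[i] for i in range(len(x))],
                             rng=rng)
            g = [g[i] + ge[i] / len(thetas) for i in range(len(x))]
        # Momentum update based on the extrapolated gradient estimate.
        m = [(1.0 - beta) * m[i] + beta * g[i] for i in range(len(x))]
        # Normalized step along the momentum direction.
        norm = max(sum(mi * mi for mi in m) ** 0.5, 1e-12)
        x_prev = x
        x = [x[i] - eta * m[i] / norm for i in range(len(x))]
    return x

x_final = sfom_multi_extrap([3.0, -2.0])
```

On the toy quadratic above, the iterates drift to a small neighborhood of the minimizer at the origin; the normalized step keeps the per-iteration movement bounded, which is a common stabilizing choice in noise-tolerant momentum methods.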
@article{he2025_2412.14488,
  title   = {A stochastic first-order method with multi-extrapolated momentum for highly smooth unconstrained optimization},
  author  = {Chuan He},
  journal = {arXiv preprint arXiv:2412.14488},
  year    = {2025}
}