A stochastic first-order method with multi-extrapolated momentum for highly smooth unconstrained optimization

19 December 2024
Chuan He
Abstract

In this paper, we consider an unconstrained stochastic optimization problem where the objective function exhibits high-order smoothness. Specifically, we propose a new stochastic first-order method (SFOM) with multi-extrapolated momentum, in which multiple extrapolations are performed in each iteration, followed by a momentum update based on these extrapolations. We demonstrate that the proposed SFOM can accelerate optimization by exploiting the high-order smoothness of the objective function $f$. Assuming that the $p$th-order derivative of $f$ is Lipschitz continuous for some $p \ge 2$, and under additional mild assumptions, we establish that our method achieves a sample complexity of $\widetilde{\mathcal{O}}(\epsilon^{-(3p+1)/p})$ for finding a point $x$ such that $\mathbb{E}[\|\nabla f(x)\|] \le \epsilon$. To the best of our knowledge, this is the first SFOM to leverage arbitrary-order smoothness of the objective function for acceleration, resulting in a sample complexity that improves upon the best-known results without assuming the mean-squared smoothness condition. Preliminary numerical experiments validate the practical performance of our method and support our theoretical findings.
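The claimed rate is easiest to read by rewriting the exponent:

    \frac{3p+1}{p} = 3 + \frac{1}{p}, \qquad p = 2 \;\Rightarrow\; \widetilde{\mathcal{O}}(\epsilon^{-3.5}), \qquad p \to \infty \;\Rightarrow\; \widetilde{\mathcal{O}}(\epsilon^{-3}),

so already at $p = 2$ the bound improves on the classical $\mathcal{O}(\epsilon^{-4})$ sample complexity of plain SGD under Lipschitz-continuous gradients, and higher-order smoothness pushes the exponent toward 3.

The abstract describes the method only at a high level: several extrapolations per iteration, whose stochastic gradients feed a momentum update. The sketch below is a minimal, hypothetical rendering of that iteration shape, not the paper's actual algorithm; the extrapolation weights, step size, momentum schedule, and the function name sfom_multi_extrapolated_momentum are all illustrative assumptions.

import numpy as np

def sfom_multi_extrapolated_momentum(
    stoch_grad,             # stoch_grad(x) -> unbiased estimate of grad f(x)
    x0,                     # initial point
    num_extrapolations=2,   # number of extrapolated points per iteration (assumption)
    eta=1e-2,               # step size (illustrative, not the paper's schedule)
    beta=0.9,               # momentum averaging weight (illustrative)
    alpha=0.5,              # extrapolation weight (illustrative)
    num_iters=1000,
):
    """Hypothetical sketch of an SFOM with multi-extrapolated momentum.

    Per iteration: form several extrapolated points from the current and
    previous iterates, query a stochastic gradient at each, average them,
    and fold the average into a momentum term that drives a first-order
    update. The actual weights and schedules in He (arXiv:2412.14488) differ.
    """
    x_prev = np.array(x0, dtype=float)
    x = x_prev.copy()
    m = np.zeros_like(x)  # momentum buffer
    for _ in range(num_iters):
        # Multiple extrapolations along the last displacement x - x_prev.
        grads = []
        for k in range(1, num_extrapolations + 1):
            z_k = x + (alpha ** k) * (x - x_prev)  # k-th extrapolated point
            grads.append(stoch_grad(z_k))
        g = np.mean(grads, axis=0)        # combine extrapolated gradients
        m = beta * m + (1.0 - beta) * g   # momentum update from extrapolations
        x_prev, x = x, x - eta * m        # first-order step
    return x

# Example: minimize a smooth quadratic with additive gradient noise.
rng = np.random.default_rng(0)
A = np.diag([1.0, 10.0])
sg = lambda x: A @ x + 0.01 * rng.standard_normal(2)
x_out = sfom_multi_extrapolated_momentum(sg, x0=[5.0, -3.0])

The design point the sketch tries to capture is that only stochastic gradient evaluations are used (hence "first-order"), while the extrapolated query points let the method implicitly probe higher-order smoothness of $f$ without ever forming Hessians or higher derivatives.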

@article{he2025_2412.14488,
  title={A stochastic first-order method with multi-extrapolated momentum for highly smooth unconstrained optimization},
  author={Chuan He},
  journal={arXiv preprint arXiv:2412.14488},
  year={2025}
}