More Optimal Fractional-Order Stochastic Gradient Descent for Non-Convex Optimization Problems

5 May 2025
Mohammad Partohaghighi
Roummel Marcia
YangQuan Chen
Abstract

Fractional-order stochastic gradient descent (FOSGD) leverages fractional exponents to capture long-memory effects in optimization. However, its utility is often limited by the difficulty of tuning and stabilizing these exponents. We propose 2SED Fractional-Order Stochastic Gradient Descent (2SEDFOSGD), which integrates the Two-Scale Effective Dimension (2SED) algorithm with FOSGD to adapt the fractional exponent in a data-driven manner. By tracking model sensitivity and effective dimensionality, 2SEDFOSGD dynamically modulates the exponent to mitigate oscillations and hasten convergence. Theoretically, for non-convex optimization problems, this approach preserves the advantages of fractional memory without the sluggish or unstable behavior observed in naïve fractional SGD. Empirical evaluations in Gaussian and α-stable noise scenarios using an autoregressive (AR) model highlight faster convergence and more robust parameter estimates compared to baseline methods, underscoring the potential of dimension-aware fractional techniques for advanced modeling and estimation tasks.
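
The abstract describes the method only at a high level; the sketch below illustrates the general idea of an adaptive fractional-order SGD update on a toy AR(1) estimation problem. It is a minimal sketch under stated assumptions, not the paper's algorithm: it uses a Caputo-type one-term approximation of the fractional gradient, w ← w − lr · g(w) · |w − w_prev|^(1−α) / Γ(2−α), and a simple gradient-alignment heuristic as a stand-in for the 2SED sensitivity/effective-dimension signal.

import numpy as np
from math import gamma

rng = np.random.default_rng(0)

# Synthetic AR(1) data: x_t = a * x_{t-1} + Gaussian noise.
true_a = 0.7
x = np.zeros(2000)
for t in range(1, len(x)):
    x[t] = true_a * x[t - 1] + rng.normal(scale=0.5)

def stoch_grad(a, t):
    """Stochastic gradient of 0.5 * (x_t - a * x_{t-1})**2 with respect to a."""
    return -(x[t] - a * x[t - 1]) * x[t - 1]

a_hat, a_prev = 0.0, 0.0
alpha, lr, prev_g = 0.9, 0.02, 0.0
for t in range(1, len(x)):
    g = stoch_grad(a_hat, t)
    # Hypothetical exponent adaptation standing in for 2SED: push alpha toward 1
    # (plain SGD) when consecutive gradients flip sign (oscillation), and relax
    # it back toward 0.7 when they agree.
    if g * prev_g < 0:
        alpha = min(1.0, alpha + 0.02)
    else:
        alpha = max(0.7, alpha - 0.002)
    prev_g = g
    # Caputo-type fractional scaling of the step; equals 1 when alpha = 1,
    # recovering ordinary SGD.
    step_scale = max(abs(a_hat - a_prev), 1e-8) ** (1.0 - alpha) / gamma(2.0 - alpha)
    a_prev = a_hat
    a_hat -= lr * g * step_scale

print(f"estimated AR(1) coefficient: {a_hat:.3f} (true value {true_a})")

Because the scaling factor reduces to 1 at α = 1, the placeholder rule drives the update toward plain SGD when consecutive gradients oscillate, loosely mirroring the paper's goal of damping oscillations while retaining fractional memory elsewhere.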

@article{partohaghighi2025_2505.02985,
  title={More Optimal Fractional-Order Stochastic Gradient Descent for Non-Convex Optimization Problems},
  author={Mohammad Partohaghighi and Roummel Marcia and YangQuan Chen},
  journal={arXiv preprint arXiv:2505.02985},
  year={2025}
}