Effective Dimension Aware Fractional-Order Stochastic Gradient Descent for Convex Optimization Problems

Abstract

Fractional-order stochastic gradient descent (FOSGD) leverages fractional exponents to capture long-memory effects in optimization. However, its utility is often limited by the difficulty of tuning and stabilizing these exponents. We propose 2SED Fractional-Order Stochastic Gradient Descent (2SEDFOSGD), which integrates the Two-Scale Effective Dimension (2SED) algorithm with FOSGD to adapt the fractional exponent in a data-driven manner. By tracking model sensitivity and effective dimensionality, 2SEDFOSGD dynamically modulates the exponent to mitigate oscillations and hasten convergence. Theoretically, this approach preserves the advantages of fractional memory without the sluggish or unstable behavior observed in naïve fractional SGD. Empirical evaluations in Gaussian and α-stable noise scenarios using an autoregressive (AR) model, as well as on the MNIST and CIFAR-100 datasets for image classification, highlight faster convergence and more robust parameter estimates compared to baseline methods, underscoring the potential of dimension-aware fractional techniques for advanced modeling and estimation tasks.
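To make the idea concrete, here is a minimal sketch of a fractional-order SGD update on a toy convex problem. It uses one common Caputo-style formulation, in which each step is weighted by the previous step size raised to a fractional power, $|\theta_t - \theta_{t-1}|^{1-\alpha}/\Gamma(2-\alpha)$; the paper's exact update rule and the 2SED-based adaptation of α are not reproduced here, and the function name and interface are illustrative assumptions.

```python
import numpy as np
from math import gamma

def fractional_sgd(grad_fn, theta0, alpha=0.9, lr=0.5, steps=500):
    """Illustrative fractional-order (gradient) descent sketch.

    The memory factor |theta_t - theta_{t-1}|^(1-alpha) / Gamma(2-alpha)
    is one common Caputo-style FOSGD weighting; it is NOT the paper's
    2SED-adapted update, which additionally modulates alpha online.
    """
    theta = np.asarray(theta0, dtype=float)
    prev = theta + 1e-3  # small offset so the first memory term is nonzero
    for _ in range(steps):
        g = grad_fn(theta)
        mem = np.abs(theta - prev) ** (1.0 - alpha) / gamma(2.0 - alpha)
        theta, prev = theta - lr * g * mem, theta
    return theta

# Toy convex problem: minimize 0.5 * ||theta - b||^2, whose gradient is theta - b.
b = np.array([1.0, -2.0])
theta_hat = fractional_sgd(lambda th: th - b, np.zeros(2))
```

For α → 1 the memory factor approaches 1 and the scheme reduces to ordinary gradient descent, which is why stabilizing and adapting α is the central tuning question the paper addresses.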

@article{partohaghighi2025_2503.13764,
  title={Effective Dimension Aware Fractional-Order Stochastic Gradient Descent for Convex Optimization Problems},
  author={Mohammad Partohaghighi and Roummel Marcia and YangQuan Chen},
  journal={arXiv preprint arXiv:2503.13764},
  year={2025}
}