
Theoretical Framework for Tempered Fractional Gradient Descent: Application to Breast Cancer Classification

Abstract

This paper introduces Tempered Fractional Gradient Descent (TFGD), a novel optimization framework that combines fractional calculus with exponential tempering to enhance gradient-based learning. Traditional gradient descent methods often suffer from oscillatory updates and slow convergence in high-dimensional, noisy landscapes. TFGD addresses these limitations by incorporating a tempered memory mechanism, where historical gradients are weighted by fractional coefficients $|w_j| = \binom{\alpha}{j}$ and exponentially decayed via a tempering parameter $\lambda$. Theoretical analysis establishes TFGD's convergence guarantees: in convex settings, it achieves an $\mathcal{O}(1/K)$ rate with alignment coefficient $d_{\alpha,\lambda} = (1 - e^{-\lambda})^{-\alpha}$, while stochastic variants attain $\mathcal{O}(1/k^\alpha)$ error decay. The algorithm maintains $\mathcal{O}(n)$ time complexity equivalent to SGD, with memory overhead scaling as $\mathcal{O}(d/\lambda)$ for parameter dimension $d$. Empirical validation on the Breast Cancer Wisconsin dataset demonstrates TFGD's superiority, achieving 98.25% test accuracy (vs. 92.11% for SGD) and 2× faster convergence. The tempered memory mechanism proves particularly effective in medical classification tasks, where feature correlations benefit from stable gradient averaging. These results position TFGD as a robust alternative to conventional optimizers in both theoretical and applied machine learning.
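The abstract gives enough structure to sketch the update rule: past gradients are combined with weights $w_j = \binom{\alpha}{j} e^{-\lambda j}$, and the exponential tempering lets the history be truncated after roughly $1/\lambda$ terms, consistent with the stated $\mathcal{O}(d/\lambda)$ memory overhead. The sketch below is a minimal NumPy illustration of that idea, not the paper's reference implementation; the learning rate, the sign convention on the weights, and the truncation cutoff are all assumptions.

```python
import numpy as np

def tfgd(grad_fn, theta0, lr=0.1, alpha=0.7, lam=0.5, n_steps=200):
    """Minimal sketch of tempered fractional gradient descent.

    Combines the last few gradients with weights
        w_j = |C(alpha, j)| * exp(-lam * j),
    truncating the history once tempering makes w_j negligible
    (a hypothetical cutoff of ~10/lam terms, matching the
    O(d / lam) memory overhead stated in the abstract).
    """
    theta = np.asarray(theta0, dtype=float)

    # Precompute tempered fractional weights up to the truncation horizon.
    horizon = max(1, int(np.ceil(10.0 / lam)))  # assumed cutoff choice
    w = np.empty(horizon)
    c = 1.0  # generalized binomial coefficient, C(alpha, 0) = 1
    for j in range(horizon):
        if j > 0:
            c *= (alpha - j + 1) / j  # recurrence for C(alpha, j)
        w[j] = abs(c) * np.exp(-lam * j)

    history = []  # most recent gradient first
    for _ in range(n_steps):
        history.insert(0, grad_fn(theta))
        history = history[:horizon]  # bounded memory
        g = sum(w[j] * history[j] for j in range(len(history)))
        theta = theta - lr * g
    return theta

# Example: minimize f(x) = 0.5 * ||x||^2, whose gradient is x itself.
x_star = tfgd(lambda x: x, theta0=np.ones(5))
```

Note that summing over the truncated history costs $\mathcal{O}(d/\lambda)$ per step in this naive form; the abstract's claim of SGD-equivalent $\mathcal{O}(n)$ time presumably relies on a more careful incremental scheme than this sketch attempts.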

@article{naifar2025_2504.18849,
  title={Theoretical Framework for Tempered Fractional Gradient Descent: Application to Breast Cancer Classification},
  author={Omar Naifar},
  journal={arXiv preprint arXiv:2504.18849},
  year={2025}
}