Theoretical Framework for Tempered Fractional Gradient Descent: Application to Breast Cancer Classification

This paper introduces Tempered Fractional Gradient Descent (TFGD), a novel optimization framework that combines fractional calculus with exponential tempering to enhance gradient-based learning. Traditional gradient descent methods often suffer from oscillatory updates and slow convergence in high-dimensional, noisy landscapes. TFGD addresses these limitations through a tempered memory mechanism in which historical gradients are weighted by fractional binomial coefficients and exponentially decayed via a tempering parameter $\lambda$. Theoretical analysis establishes TFGD's convergence guarantees: in convex settings it achieves an $\mathcal{O}(1/K)$ rate with alignment coefficient $d_{\alpha,\lambda}$, while stochastic variants attain $\mathcal{O}(1/\sqrt{k})$ error decay. The algorithm maintains per-iteration time complexity equivalent to SGD, with memory overhead scaling as $\mathcal{O}(d)$ for parameter dimension $d$. Empirical validation on the Breast Cancer Wisconsin dataset demonstrates TFGD's superiority, achieving 98.25% test accuracy (vs. 92.11% for SGD) and 2x faster convergence. The tempered memory mechanism proves particularly effective in medical classification tasks, where correlated features benefit from stable gradient averaging. These results position TFGD as a robust alternative to conventional optimizers in both theoretical and applied machine learning.
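
To make the update rule concrete, the following is a minimal Python sketch of one TFGD step, assuming the fractional weights are the Grunwald-Letnikov coefficients $(-1)^j \binom{\alpha}{j}$ tempered by $e^{-\lambda j}$ and truncated to a finite window. The function name, parameter defaults, and window size are illustrative assumptions, not the paper's reference implementation.

import numpy as np
from scipy.special import binom

def tfgd_step(theta, grad_history, alpha=0.5, lam=1.0, lr=0.01, window=20):
    """One illustrative TFGD update (hypothetical implementation).

    Combines past gradients with Grunwald-Letnikov fractional weights
    (-1)^j * C(alpha, j), exponentially tempered by exp(-lam * j) and
    truncated to `window` terms. grad_history[0] is the newest gradient.
    """
    direction = np.zeros_like(theta)
    for j, g in enumerate(grad_history[:window]):
        # Fractional binomial weight, exponentially tempered.
        w = ((-1.0) ** j) * binom(alpha, j) * np.exp(-lam * j)
        direction += w * g
    return theta - lr * direction

# Toy usage: minimize f(x) = ||x||^2, whose gradient is 2x.
theta = np.array([5.0, -3.0])
history = []
for _ in range(200):
    history.insert(0, 2.0 * theta)  # prepend the newest gradient
    theta = tfgd_step(theta, history)
print(theta)  # approaches the origin

Note that for large $\lambda$ the tempering factor suppresses all but the most recent gradient ($w_0 = 1$), so the update reduces to plain gradient descent; a small $\lambda$ lengthens the memory, which is the stabilizing averaging effect the abstract attributes to correlated features.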
@article{naifar2025_2504.18849,
  title={Theoretical Framework for Tempered Fractional Gradient Descent: Application to Breast Cancer Classification},
  author={Omar Naifar},
  journal={arXiv preprint arXiv:2504.18849},
  year={2025}
}