Adaptive Accelerated Proximal Gradient Methods with Variance Reduction for Composite Nonconvex Finite-Sum Minimization

This paper proposes {\sf AAPG-SPIDER}, an Adaptive Accelerated Proximal Gradient (AAPG) method with variance reduction for minimizing composite nonconvex finite-sum functions. It integrates three acceleration techniques: adaptive stepsizes, Nesterov's extrapolation, and the recursive stochastic path-integrated estimator SPIDER. While targeting stochastic finite-sum problems, {\sf AAPG-SPIDER} simplifies to {\sf AAPG} in the full-batch, non-stochastic setting, which is also of independent interest. To our knowledge, {\sf AAPG-SPIDER} and {\sf AAPG} are the first learning-rate-free methods to achieve optimal iteration complexity for this class of \textit{composite} minimization problems. Specifically, {\sf AAPG} achieves the optimal iteration complexity of $\mathcal{O}(\epsilon^{-2})$, while {\sf AAPG-SPIDER} achieves $\mathcal{O}(n + \sqrt{n}\,\epsilon^{-2})$ for finding $\epsilon$-approximate stationary points, where $n$ is the number of component functions. Under the Kurdyka-Łojasiewicz (KL) assumption, we establish non-ergodic convergence rates for both methods. Preliminary experiments on sparse phase retrieval and linear eigenvalue problems demonstrate the superior performance of {\sf AAPG-SPIDER} and {\sf AAPG} compared to existing methods.
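To illustrate how the three ingredients fit together, the following Python sketch shows a generic SPIDER-type variance-reduced proximal gradient loop with Nesterov-style extrapolation for problems of the form $\min_x \frac{1}{n}\sum_{i=1}^n f_i(x) + h(x)$. This is a minimal sketch of the general technique, not the paper's {\sf AAPG-SPIDER}: the adaptive, learning-rate-free stepsize rule is replaced by a fixed stepsize, and the names `grad_i`, `soft_threshold`, the $\ell_1$ regularizer, and the epoch length `q` are illustrative assumptions.

```python
# Illustrative sketch only (not the paper's exact AAPG-SPIDER):
# a SPIDER-type variance-reduced proximal gradient loop with
# Nesterov-style extrapolation for min_x (1/n) sum_i f_i(x) + h(x).
import numpy as np

def soft_threshold(z, tau):
    """Proximal operator of tau * ||.||_1, used here as an example prox of h."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def spider_prox_gradient(grad_i, n, x0, T, eta=0.1, beta=0.5, lam=0.01,
                         q=None, batch=8, seed=None):
    """Generic SPIDER-style proximal gradient iteration (hypothetical helper).

    grad_i(i, x): gradient of the i-th component f_i at x.
    Every q iterations the full gradient is recomputed; in between, the
    estimator is updated recursively from a small mini-batch.
    """
    rng = np.random.default_rng(seed)
    q = q or max(1, int(np.sqrt(n)))      # common SPIDER epoch length choice
    x_prev = x = x0.copy()
    v = y_old = None
    for t in range(T):
        y = x + beta * (x - x_prev)       # Nesterov-style extrapolation point
        if t % q == 0:                    # full-gradient refresh
            v = np.mean([grad_i(i, y) for i in range(n)], axis=0)
        else:                             # recursive (path-integrated) update
            idx = rng.integers(0, n, size=batch)
            v = v + np.mean([grad_i(i, y) - grad_i(i, y_old) for i in idx],
                            axis=0)
        y_old = y
        x_prev, x = x, soft_threshold(y - eta * v, eta * lam)  # proximal step
    return x
```

In SPIDER-type methods the full-gradient refresh every $q \approx \sqrt{n}$ iterations is the usual choice behind finite-sum complexities of the $\mathcal{O}(n + \sqrt{n}\,\epsilon^{-2})$ type; the fixed `eta` and `beta` above stand in for the adaptive stepsize and extrapolation rules analyzed in the paper.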
@article{yuan2025_2502.21099,
  title={Adaptive Accelerated Proximal Gradient Methods with Variance Reduction for Composite Nonconvex Finite-Sum Minimization},
  author={Ganzhao Yuan},
  journal={arXiv preprint arXiv:2502.21099},
  year={2025}
}