Adaptive Accelerated Proximal Gradient Methods with Variance Reduction for Composite Nonconvex Finite-Sum Minimization

Abstract

This paper proposes {\sf AAPG-SPIDER}, an Adaptive Accelerated Proximal Gradient (AAPG) method with variance reduction for minimizing composite nonconvex finite-sum functions. It integrates three acceleration techniques: adaptive stepsizes, Nesterov's extrapolation, and the recursive stochastic path-integrated estimator SPIDER. While targeting stochastic finite-sum problems, {\sf AAPG-SPIDER} simplifies to {\sf AAPG} in the full-batch, non-stochastic setting, which is also of independent interest. To our knowledge, {\sf AAPG-SPIDER} and {\sf AAPG} are the first learning-rate-free methods to achieve optimal iteration complexity for this class of \textit{composite} minimization problems. Specifically, {\sf AAPG} achieves the optimal iteration complexity of $\mathcal{O}(N \epsilon^{-2})$, while {\sf AAPG-SPIDER} achieves $\mathcal{O}(N + \sqrt{N} \epsilon^{-2})$ for finding $\epsilon$-approximate stationary points, where $N$ is the number of component functions. Under the Kurdyka-Łojasiewicz (KL) assumption, we establish non-ergodic convergence rates for both methods. Preliminary experiments on sparse phase retrieval and linear eigenvalue problems demonstrate the superior performance of {\sf AAPG-SPIDER} and {\sf AAPG} compared to existing methods.
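To make the setting concrete, the sketch below shows the generic structure of a SPIDER-type variance-reduced proximal gradient loop for a composite finite-sum objective with an $\ell_1$ regularizer. It is only an illustration of the recursive gradient estimator and the proximal step, not the paper's AAPG-SPIDER (the adaptive stepsizes and Nesterov extrapolation are omitted); the callable `grad_i`, the function names, and all parameter values are hypothetical.

```python
# Illustrative sketch only: a basic SPIDER-style variance-reduced proximal
# gradient loop for min_x (1/N) * sum_i f_i(x) + lam * ||x||_1.
# This is NOT the paper's AAPG-SPIDER: adaptive stepsizes and Nesterov
# extrapolation are omitted, and all names/parameters here are assumptions.
import numpy as np

def prox_l1(x, t):
    """Proximal operator of t * ||x||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def spider_prox_grad(grad_i, x0, N, step=0.01, lam=0.1, epoch_len=50,
                     batch=8, n_iters=500, rng=None):
    """grad_i(x, idx) must return the average gradient of the components in idx."""
    rng = np.random.default_rng() if rng is None else rng
    x, x_prev, v = x0.copy(), x0.copy(), None
    for k in range(n_iters):
        if k % epoch_len == 0:
            v = grad_i(x, np.arange(N))          # periodic full-gradient refresh
        else:
            idx = rng.integers(0, N, size=batch)
            # SPIDER recursion: v_k = grad_S(x_k) - grad_S(x_{k-1}) + v_{k-1}
            v = grad_i(x, idx) - grad_i(x_prev, idx) + v
        x_prev = x
        x = prox_l1(x - step * v, step * lam)    # proximal gradient step
    return x
```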

@article{yuan2025_2502.21099,
  title={Adaptive Accelerated Proximal Gradient Methods with Variance Reduction for Composite Nonconvex Finite-Sum Minimization},
  author={Ganzhao Yuan},
  journal={arXiv preprint arXiv:2502.21099},
  year={2025}
}