Adaptive Stochastic Variance Reduction for Non-convex Finite-Sum Minimization

3 November 2022
Ali Kavis
Stratis Skoulakis
Kimon Antonakopoulos
L. Dadi
V. Cevher
Abstract

We propose an adaptive variance-reduction method, called AdaSpider, for the minimization of $L$-smooth, non-convex functions with a finite-sum structure. In essence, AdaSpider combines an AdaGrad-inspired [Duchi et al., 2011; McMahan & Streeter, 2010], yet fairly distinct, adaptive step-size schedule with the recursive stochastic path-integrated estimator proposed in [Fang et al., 2018]. To our knowledge, AdaSpider is the first parameter-free non-convex variance-reduction method, in the sense that it does not require knowledge of problem-dependent parameters such as the smoothness constant $L$, the target accuracy $\epsilon$, or any bound on gradient norms. With this, we are able to compute an $\epsilon$-stationary point using $\tilde{O}\left(n + \sqrt{n}/\epsilon^2\right)$ oracle calls, which matches the respective lower bound up to logarithmic factors.
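
To make the combination concrete, below is a minimal sketch (not the authors' reference code) of a SPIDER-style loop paired with an AdaGrad-inspired, parameter-free step size. The exact AdaSpider step-size rule is not given in the abstract, so the accumulator-based rate used here is an illustrative assumption; the function name `adaspider_sketch` and the epoch length `q ≈ √n` follow standard SPIDER conventions.

```python
import numpy as np

def adaspider_sketch(grad_i, x0, n, epochs, q=None, rng=None):
    """Sketch of a SPIDER loop with an AdaGrad-style step size.

    grad_i(x, i): gradient of the i-th component f_i at x; n: number of components.
    """
    rng = rng or np.random.default_rng(0)
    q = q or max(1, int(np.sqrt(n)))   # epoch length ~ sqrt(n), as in SPIDER
    x_prev, x = x0.copy(), x0.copy()
    v = np.zeros_like(x0)
    accum = 0.0                        # running sum of ||v_t||^2 (AdaGrad-style)
    for t in range(epochs * q):
        if t % q == 0:
            # full-gradient refresh at the start of each epoch
            v = np.mean([grad_i(x, i) for i in range(n)], axis=0)
        else:
            # SPIDER recursive path-integrated update with one sampled component
            i = int(rng.integers(n))
            v = grad_i(x, i) - grad_i(x_prev, i) + v
        accum += float(np.dot(v, v))
        # adaptive step size built only from observed quantities
        # (illustrative form, not necessarily the paper's exact rule)
        eta = 1.0 / (n ** 0.25 * np.sqrt(np.sqrt(n) + accum))
        x_prev, x = x, x - eta * v
    return x
```

Note that no smoothness constant, target accuracy, or gradient bound enters the step size: it is assembled entirely from the gradient-estimator norms observed so far, which is what "parameter-free" refers to in the abstract.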
