arXiv:1703.10993v3 (latest)
Catalyst Acceleration for Gradient-Based Non-Convex Optimization

31 March 2017
Courtney Paquette
Hongzhou Lin
Dmitriy Drusvyatskiy
Julien Mairal
Zaïd Harchaoui
Abstract

We introduce a generic scheme to solve nonconvex optimization problems using gradient-based algorithms originally designed for minimizing convex functions. When the objective is convex, the proposed approach enjoys the same properties as the Catalyst approach of Lin et al. [22]. When the objective is nonconvex, it achieves the best known convergence rate to stationary points for first-order methods. Specifically, the proposed algorithm does not require knowledge about the convexity of the objective; yet, it obtains an overall worst-case efficiency of $\tilde{O}(\varepsilon^{-2})$ and, if the function is convex, the complexity reduces to the near-optimal rate $\tilde{O}(\varepsilon^{-2/3})$. We conclude the paper by showing promising experimental results obtained by applying the proposed approach to SVRG and SAGA for sparse matrix factorization and for learning neural networks.
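The scheme described above wraps a convex solver in an outer proximal-point loop: each outer step adds a quadratic term $(\kappa/2)\|x - y\|^2$ that convexifies the subproblem when $\kappa$ exceeds the weak-convexity constant of the objective, and the resulting subproblem is solved approximately by the inner method. Below is a minimal, hypothetical Python sketch of such an outer loop, not the paper's exact algorithm: plain gradient descent stands in for the convex inner solver (SVRG or SAGA in the paper's experiments), the function names and step sizes are illustrative, and the extrapolation step and the adaptive switch between convex and nonconvex behavior are omitted.

```python
import numpy as np

def catalyst_outer_loop(f_grad, x0, kappa=1.0, inner_steps=200,
                        lr=0.1, outer_iters=50):
    # Hypothetical sketch of a Catalyst-style outer loop. Each outer
    # iteration approximately minimizes the regularized subproblem
    #     f(z) + (kappa/2) * ||z - y||^2
    # around the current iterate y, using gradient descent as a
    # stand-in for the convex inner solver.
    x = x0.astype(float).copy()
    for _ in range(outer_iters):
        y = x.copy()          # prox center for this subproblem
        z = x.copy()
        for _ in range(inner_steps):
            # Gradient of the subproblem; for kappa larger than the
            # weak-convexity constant of f, the subproblem is convex.
            g = f_grad(z) + kappa * (z - y)
            z = z - lr * g
        x = z                 # accept the approximate subproblem minimizer
    return x

# Toy usage on the smooth nonconvex f(x) = sum(x_i^2 + 0.5*sin(3*x_i))
f_grad = lambda x: 2.0 * x + 1.5 * np.cos(3.0 * x)
x_star = catalyst_outer_loop(f_grad, x0=np.array([2.0, -1.5]))
print(x_star)  # near a stationary point of f, i.e. f_grad(x_star) ~ 0
```

Even this stripped-down loop illustrates the key design choice: the inner solver never needs to know whether f itself is convex, since it only ever sees the regularized (convex) subproblems.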
