Stochastic Subgradient Descent Escapes Active Strict Saddles on Weakly Convex Functions

4 August 2021
Pascal Bianchi
W. Hachem
S. Schechtman
Abstract

In non-smooth stochastic optimization, we establish the non-convergence of the stochastic subgradient descent (SGD) to the critical points recently called active strict saddles by Davis and Drusvyatskiy. Such points lie on a manifold M where the function f has a direction of second-order negative curvature. Off this manifold, the norm of the Clarke subdifferential of f is lower-bounded. We require two conditions on f. The first assumption is a Verdier stratification condition, which is a refinement of the popular Whitney stratification. It allows us to establish a reinforced version of the projection formula of Bolte et al. for Whitney stratifiable functions, which is of independent interest. The second assumption, termed the angle condition, allows us to control the distance of the iterates to M. When f is weakly convex, our assumptions are generic. Consequently, generically in the class of definable weakly convex functions, the SGD converges to a local minimizer.
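
As a point of reference, and not taken from the paper itself, the stochastic subgradient recursion discussed in the abstract is commonly written as below; the symbols x_k (iterates), gamma_k (step sizes), v_k (a Clarke subgradient), and xi_k (zero-mean noise) are notation assumed for this sketch rather than fixed by the abstract:

\[
  x_{k+1} \;=\; x_k - \gamma_k \,\bigl(v_k + \xi_k\bigr),
  \qquad v_k \in \partial f(x_k),
\]

where \(\partial f\) denotes the Clarke subdifferential. Informally, and following the description in the abstract, a point \(x^\ast\) on the manifold M is an active strict saddle when the restriction of f to M has a direction of second-order negative curvature at \(x^\ast\), while off M, near \(x^\ast\), the Clarke subgradients stay bounded away from zero:

\[
  \inf_{u \in \partial f(x)} \lVert u \rVert \;\ge\; c > 0
  \qquad \text{for all } x \notin M \text{ in a neighborhood of } x^\ast .
\]

The abstract's claim is that, under the Verdier stratification and angle conditions, SGD does not converge to such points, so that generically, for definable weakly convex f, it converges to a local minimizer.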
