ResearchTrend.AI

arXiv:2108.02072, v4 (latest)

Stochastic Subgradient Descent Escapes Active Strict Saddles on Weakly Convex Functions

4 August 2021
Pascal Bianchi
W. Hachem
S. Schechtman

Papers citing "Stochastic Subgradient Descent Escapes Active Strict Saddles on Weakly Convex Functions"

2 papers
Title: Stochastic Subgradient Descent on a Generic Definable Function Converges to a Minimizer
Authors: S. Schechtman
Date: 06 Sep 2021

Title: Active manifolds, stratifications, and convergence to local minima in nonsmooth optimization
Authors: Damek Davis, Dmitriy Drusvyatskiy, L. Jiang
Date: 26 Aug 2021