Demystifying the Myths and Legends of Nonconvex Convergence of SGD

arXiv:2310.12969 · 19 October 2023
Aritra Dutta, El Houcine Bergou, Soumia Boucherouite, Nicklas Werge, M. Kandemir, Xin Li

Papers citing "Demystifying the Myths and Legends of Nonconvex Convergence of SGD" (5 papers shown)
1. Stochastic Approximation Beyond Gradient for Signal Processing and Machine Learning
   Aymeric Dieuleveut, G. Fort, Eric Moulines, Hoi-To Wai (22 Feb 2023)

2. On the Convergence of Step Decay Step-Size for Stochastic Optimization
   Xiaoyu Wang, Sindri Magnússon, M. Johansson (18 Feb 2021)

3. A Simple Convergence Proof of Adam and Adagrad
   Alexandre Défossez, Léon Bottou, Francis R. Bach, Nicolas Usunier (05 Mar 2020)

4. Stochastic Nonconvex Optimization with Large Minibatches
   Weiran Wang, Nathan Srebro (25 Sep 2017)

5. Stochastic Gradient Descent for Non-smooth Optimization: Convergence Results and Optimal Averaging Schemes
   Ohad Shamir, Tong Zhang (08 Dec 2012)