ResearchTrend.AI

Analysis of Stochastic Gradient Descent in Continuous Time
J. Latz
arXiv:2004.07177, 15 April 2020

Papers citing "Analysis of Stochastic Gradient Descent in Continuous Time"

6 papers:
The Expected Loss of Preconditioned Langevin Dynamics Reveals the Hessian Rank
Amitay Bar, Rotem Mulayoff, T. Michaeli, Ronen Talmon
21 Feb 2024
Beyond first-order methods for non-convex non-concave min-max optimization
Abhijeet Vyas, Brian Bullins
17 Apr 2023
Online Learning for the Random Feature Model in the Student-Teacher Framework
Roman Worschech, B. Rosenow
24 Mar 2023
Analysis of Kinetic Models for Label Switching and Stochastic Gradient Descent
Martin Burger, Alex Rossi
01 Jul 2022
The Zig-Zag Process and Super-Efficient Sampling for Bayesian Analysis of Big Data
J. Bierkens, Paul Fearnhead, Gareth O. Roberts
11 Jul 2016
The Loss Surfaces of Multilayer Networks
A. Choromańska, Mikael Henaff, Michaël Mathieu, Gerard Ben Arous, Yann LeCun
30 Nov 2014