
Law of Balance and Stationary Distribution of Stochastic Gradient Descent

Liu Ziyin, Hongchao Li, Masakuni Ueda
arXiv:2308.06671 · 13 August 2023

Papers citing "Law of Balance and Stationary Distribution of Stochastic Gradient Descent"

6 of 6 papers shown.

| Title | Authors | Topics | Metrics | Date |
|---|---|---|---|---|
| Do Parameters Reveal More than Loss for Membership Inference? | Anshuman Suri, Xiao Zhang, David E. Evans | MIACV, MIALM, AAML | 44 / 1 / 0 | 17 Jun 2024 |
| Stochastic Thermodynamics of Learning Parametric Probabilistic Models | S. Parsi | — | 31 / 0 / 0 | 04 Oct 2023 |
| Type-II Saddles and Probabilistic Stability of Stochastic Gradient Descent | Liu Ziyin, Botao Li, Tomer Galanti, Masakuni Ueda | — | 32 / 7 / 0 | 23 Mar 2023 |
| Large Learning Rate Tames Homogeneity: Convergence and Balancing Effect | Yuqing Wang, Minshuo Chen, T. Zhao, Molei Tao | AI4CE | 55 / 40 / 0 | 07 Oct 2021 |
| Neural Mechanics: Symmetry and Broken Conservation Laws in Deep Learning Dynamics | D. Kunin, Javier Sagastuy-Breña, Surya Ganguli, Daniel L. K. Yamins, Hidenori Tanaka | — | 99 / 77 / 0 | 08 Dec 2020 |
| On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima | N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang | ODL | 273 / 2,886 / 0 | 15 Sep 2016 |