ResearchTrend.AI

Almost Bayesian: The Fractal Dynamics of Stochastic Gradient Descent

28 March 2025
Max Hennick
Stijn De Baerdemacker
Abstract

We show that the behavior of stochastic gradient descent (SGD) is related to Bayesian statistics: SGD is effectively diffusion on a fractal loss landscape, and the fractal dimension can be accounted for in a purely Bayesian way. From this, SGD can be regarded as a modified Bayesian sampler that respects the accessibility constraints induced by the fractal structure of the loss landscape. We verify our results experimentally by examining the diffusion of the weights during training. These results offer insight into the factors that determine the learning process, and appear to answer the question of how SGD and purely Bayesian sampling are related.
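The abstract's experimental check examines how the weights diffuse during training. A common way to probe this is to track the mean squared displacement (MSD) of the weights over SGD steps and fit a power law MSD(t) ∝ t^α, where α = 1 corresponds to ordinary diffusion and α ≠ 1 to anomalous diffusion. The sketch below is a minimal illustration of that measurement on a toy linear-regression problem; the model, loss, hyperparameters, and fitting window are assumptions for demonstration, not the paper's actual experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (illustrative, not the paper's experiment): linear regression
# y = X @ w_true + noise, trained with minibatch SGD.
n, d = 2000, 20
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)

w = np.zeros(d)
w0 = w.copy()
lr, batch, steps = 0.01, 16, 2000
msd = np.empty(steps)  # mean squared displacement of weights from w0

for t in range(steps):
    idx = rng.choice(n, size=batch, replace=False)
    grad = X[idx].T @ (X[idx] @ w - y[idx]) / batch  # minibatch gradient
    w -= lr * grad
    msd[t] = np.mean((w - w0) ** 2)

# Fit MSD(t) ~ t**alpha on an early window via log-log least squares.
ts = np.arange(10, 500)
alpha, _ = np.polyfit(np.log(ts), np.log(msd[ts]), 1)
print(f"estimated diffusion exponent alpha = {alpha:.2f}")
```

On this toy problem the early dynamics are dominated by deterministic drift toward the minimum, so the fitted exponent should not be read as the paper's result; the point is only the measurement recipe: record displacements, then fit the log-log slope.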

View on arXiv
@article{hennick2025_2503.22478,
  title={Almost Bayesian: The Fractal Dynamics of Stochastic Gradient Descent},
  author={Max Hennick and Stijn De Baerdemacker},
  journal={arXiv preprint arXiv:2503.22478},
  year={2025}
}