Chaotic Regularization and Heavy-Tailed Limits for Deterministic Gradient Descent
S. H. Lim, Yijun Wan, Umut Şimşekli
arXiv:2205.11361, 23 May 2022
Papers citing "Chaotic Regularization and Heavy-Tailed Limits for Deterministic Gradient Descent" (6 of 6 shown):

1. Generalization Guarantees for Multi-View Representation Learning and Application to Regularization via Gaussian Product Mixture Prior. Milad Sefidgaran, Abdellatif Zaidi, Piotr Krasnowski. 25 Apr 2025.
2. Generalization Guarantees for Representation Learning via Data-Dependent Gaussian Mixture Priors. Milad Sefidgaran, A. Zaidi, Piotr Krasnowski. 21 Feb 2025.
3. Privacy of SGD under Gaussian or Heavy-Tailed Noise: Guarantees without Gradient Clipping. Umut Simsekli, Mert Gurbuzbalaban, S. Yıldırım, Lingjiong Zhu. 04 Mar 2024.
4. Algorithmic Stability of Heavy-Tailed SGD with General Loss Functions. Anant Raj, Lingjiong Zhu, Mert Gurbuzbalaban, Umut Simsekli. 27 Jan 2023.
5. Stochastic Training is Not Necessary for Generalization. Jonas Geiping, Micah Goldblum, Phillip E. Pope, Michael Moeller, Tom Goldstein. 29 Sep 2021.
6. On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima. N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang. 15 Sep 2016.