ResearchTrend.AI

High-probability Convergence Bounds for Nonlinear Stochastic Gradient Descent Under Heavy-tailed Noise
arXiv:2310.18784

28 October 2023
Aleksandar Armacki
Pranay Sharma
Gauri Joshi
Dragana Bajović
D. Jakovetić
S. Kar

Papers citing "High-probability Convergence Bounds for Nonlinear Stochastic Gradient Descent Under Heavy-tailed Noise"

5 / 5 papers shown

  • From Gradient Clipping to Normalization for Heavy Tailed SGD
    Florian Hübler, Ilyas Fatkhullin, Niao He (17 Oct 2024)
  • Differential Private Stochastic Optimization with Heavy-tailed Data: Towards Optimal Rates
    Puning Zhao, Jiafei Wu, Zhe Liu, Chong Wang, Rongfei Fan, Qingming Li (19 Aug 2024)
  • Gradient Based Clustering
    Aleksandar Armacki, Dragana Bajović, D. Jakovetić, S. Kar (01 Feb 2022)
  • Personalized Federated Learning via Convex Clustering [FedML]
    Aleksandar Armacki, Dragana Bajović, D. Jakovetić, S. Kar (01 Feb 2022)
  • A High Probability Analysis of Adaptive SGD with Momentum
    Xiaoyun Li, Francesco Orabona (28 Jul 2020)