Smoothed Gradient Clipping and Error Feedback for Distributed Optimization under Heavy-Tailed Noise

25 October 2023
Shuhua Yu, D. Jakovetić, S. Kar
arXiv:2310.16920

Papers citing "Smoothed Gradient Clipping and Error Feedback for Distributed Optimization under Heavy-Tailed Noise"

2 papers shown

Revisiting Gradient Clipping: Stochastic bias and tight convergence guarantees
Anastasia Koloskova, Hadrien Hendrikx, Sebastian U. Stich
02 May 2023

Taming Fat-Tailed ("Heavier-Tailed" with Potentially Infinite Variance) Noise in Federated Learning
Haibo Yang, Pei-Yuan Qiu, Jia Liu
FedML
03 Oct 2022