ResearchTrend.AI

A Communication-Efficient Distributed Gradient Clipping Algorithm for Training Deep Neural Networks

arXiv: 2205.05040 · 10 May 2022

Mingrui Liu, Zhenxun Zhuang, Yunwei Lei, Chunyang Liao
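The title refers to gradient clipping in a distributed training setting. As a point of reference only (this is the standard global-norm clipping primitive, not the paper's communication-efficient variant), the basic operation each worker performs can be sketched as:

```python
import numpy as np

def clip_by_global_norm(grads, max_norm):
    """Scale a list of gradient arrays so their joint L2 norm
    does not exceed max_norm (standard global-norm clipping).

    Returns the scaled gradients and the pre-clipping norm.
    """
    total_norm = np.sqrt(sum(float(np.sum(g * g)) for g in grads))
    # Small epsilon guards against division by zero for all-zero gradients.
    scale = min(1.0, max_norm / (total_norm + 1e-12))
    return [g * scale for g in grads], total_norm

# Example: a gradient of norm 5 clipped to norm 1.
grads = [np.array([3.0, 4.0])]
clipped, norm = clip_by_global_norm(grads, max_norm=1.0)
```

In a naive distributed implementation, workers would first average their local gradients (one communication round) and then apply this clipping step; the paper's contribution is reducing that communication cost.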

Papers citing "A Communication-Efficient Distributed Gradient Clipping Algorithm for Training Deep Neural Networks"

3 papers shown:

1. Sketched Adaptive Federated Deep Learning: A Sharp Convergence Analysis
   Zhijie Chen, Qiaobo Li, A. Banerjee · FedML · 11 Nov 2024
2. An Accelerated Algorithm for Stochastic Bilevel Optimization under Unbounded Smoothness
   Xiaochuan Gong, Jie Hao, Mingrui Liu · 28 Sep 2024
3. Clip21: Error Feedback for Gradient Clipping
   Sarit Khirirat, Eduard A. Gorbunov, Samuel Horváth, Rustem Islamov, Fakhri Karray, Peter Richtárik · 30 May 2023