Accelerating Gossip SGD with Periodic Global Averaging

19 May 2021
Yiming Chen, Kun Yuan, Yingya Zhang, Pan Pan, Yinghui Xu, W. Yin
arXiv:2105.09080

Papers citing "Accelerating Gossip SGD with Periodic Global Averaging"

EDiT: A Local-SGD-Based Efficient Distributed Training Method for Large Language Models
Jialiang Cheng, Ning Gao, Yun Yue, Zhiling Ye, Jiadi Jiang, Jian Sha
10 Dec 2024 (OffRL)

Communication-Efficient Federated Optimization over Semi-Decentralized Networks
He Wang, Yuejie Chi
30 Nov 2023 (FedML)

Beyond Exponential Graph: Communication-Efficient Topologies for Decentralized Learning via Finite-time Convergence
Yuki Takezawa, Ryoma Sato, Han Bao, Kenta Niwa, M. Yamada
19 May 2023

On the Limit Performance of Floating Gossip
Gianluca Rizzo, Noelia Pérez Palma, M. Marsan, Vincenzo Mancuso
16 Feb 2023 (FedML)

Exponential Graph is Provably Efficient for Decentralized Deep Training
Bicheng Ying, Kun Yuan, Yiming Chen, Hanbin Hu, Pan Pan, W. Yin
26 Oct 2021 (FedML)

A Unified and Refined Convergence Analysis for Non-Convex Decentralized Learning
Sulaiman A. Alghunaim, Kun Yuan
19 Oct 2021