ResearchTrend.AI

arXiv:1906.12345 — Cited By
Asymptotic Network Independence in Distributed Stochastic Optimization for Machine Learning

28 June 2019
Shi Pu
Alexander Olshevsky
I. Paschalidis

Papers citing "Asymptotic Network Independence in Distributed Stochastic Optimization for Machine Learning"

6 / 6 papers shown
1. CEDAS: A Compressed Decentralized Stochastic Gradient Method with Improved Convergence
   Kun Huang, Shi Pu
   14 Jan 2023

2. Heavy-Tail Phenomenon in Decentralized SGD
   Mert Gurbuzbalaban, Yuanhan Hu, Umut Simsekli, Kun Yuan, Lingjiong Zhu
   13 May 2022

3. Distributed Random Reshuffling over Networks
   Kun Huang, Xiao Li, Andre Milzarek, Shi Pu, Junwen Qiu
   31 Dec 2021

4. S-ADDOPT: Decentralized stochastic first-order optimization over directed graphs
   Muhammad I. Qureshi, Ran Xin, S. Kar, U. Khan
   15 May 2020

5. A Robust Gradient Tracking Method for Distributed Optimization over Directed Networks
   Shi Pu
   31 Mar 2020

6. Swarming for Faster Convergence in Stochastic Optimization
   Shi Pu, Alfredo García
   11 Jun 2018