
Parallel Restarted SPIDER -- Communication Efficient Distributed Nonconvex Optimization with Optimal Computation Complexity

12 December 2019
Pranay Sharma, Swatantra Kafle, Prashant Khanduri, Saikiran Bulusu, K. Rajawat, P. Varshney
FedML

Papers citing "Parallel Restarted SPIDER -- Communication Efficient Distributed Nonconvex Optimization with Optimal Computation Complexity"

4 / 4 papers shown
FedDRO: Federated Compositional Optimization for Distributionally Robust Learning
Prashant Khanduri, Chengyin Li, Rafi Ibn Sultan, Yao Qiang, Joerg Kliewer, Dongxiao Zhu
21 Nov 2023

MARINA: Faster Non-Convex Distributed Learning with Compression
Eduard A. Gorbunov, Konstantin Burlachenko, Zhize Li, Peter Richtárik
15 Feb 2021

FedPD: A Federated Learning Framework with Optimal Rates and Adaptivity to Non-IID Data
Xinwei Zhang, Mingyi Hong, S. Dhople, W. Yin, Yang Liu
FedML
22 May 2020

Optimal Distributed Online Prediction using Mini-Batches
O. Dekel, Ran Gilad-Bachrach, Ohad Shamir, Lin Xiao
07 Dec 2010