ResearchTrend.AI

Sync-Switch: Hybrid Parameter Synchronization for Distributed Deep Learning (arXiv:2104.08364)

16 April 2021
Shijian Li, Oren Mangoubi, Lijie Xu, Tian Guo

Papers citing "Sync-Switch: Hybrid Parameter Synchronization for Distributed Deep Learning"

3 / 3 papers shown
FuncPipe: A Pipelined Serverless Framework for Fast and Cost-efficient Training of Deep Learning Models
Yunzhuo Liu, Bo Jiang, Tian Guo, Zimeng Huang, Wen-ping Ma, Xinbing Wang, Chenghu Zhou
28 Apr 2022
Quantifying and Improving Performance of Distributed Deep Learning with Cloud Storage
Nicholas Krichevsky, M. S. Louis, Tian Guo
13 Aug 2021
On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang
15 Sep 2016