
OSP: Boosting Distributed Model Training with 2-stage Synchronization

International Conference on Parallel Processing (ICPP), 2023
29 June 2023
Zixuan Chen
Lei Shi
Xuandong Liu
Jiahui Li
Sen Liu
Yang Xu
arXiv: 2306.16926

Papers citing "OSP: Boosting Distributed Model Training with 2-stage Synchronization"

2 papers
Rina: Enhancing Ring-AllReduce with In-network Aggregation in Distributed Model Training
IEEE International Conference on Network Protocols (ICNP), 2024
Zixuan Chen
Xuandong Liu
Minglin Li
Yinfan Hu
Hao Mei
Huifeng Xing
Hao Wang
Wanxin Shi
Sen Liu
Yang Xu
29 Jul 2024
Communication-Efficient Large-Scale Distributed Deep Learning: A Comprehensive Survey
Feng Liang
Zhen Zhang
Haifeng Lu
Victor C. M. Leung
Yanyi Guo
Xiping Hu
09 Apr 2024