Poplar: Efficient Scaling of Distributed DNN Training on Heterogeneous GPU Clusters

22 August 2024
WenZheng Zhang, Yang Hu, Jing Shi, Xiaoying Bai

Papers citing "Poplar: Efficient Scaling of Distributed DNN Training on Heterogeneous GPU Clusters"

1 paper shown

A Comparison of Optimization Algorithms for Deep Learning
Derya Soydaner
28 Jul 2020