Scaling the Wild: Decentralizing Hogwild!-style Shared-memory SGD

13 March 2022
Bapi Chatterjee, Vyacheslav Kungurtsev, Dan Alistarh
FedML

Papers citing "Scaling the Wild: Decentralizing Hogwild!-style Shared-memory SGD"

2 / 2 papers shown
Asynchronous Stochastic Gradient Descent with Decoupled Backpropagation and Layer-Wise Updates
Cabrel Teguemne Fokam, Khaleelulla Khan Nazeer, Lukas König, David Kappel, Anand Subramoney
28 · 0 · 0 · 08 Oct 2024

On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang
ODL
281 · 2,889 · 0 · 15 Sep 2016