ResearchTrend.AI
Distributed Deep Learning Using Volunteer Computing-Like Paradigm
Medha Atre, B. Jha, Ashwini Rao
arXiv:2103.08894, 16 March 2021

Papers citing "Distributed Deep Learning Using Volunteer Computing-Like Paradigm"

3 of 3 citing papers shown:
SWARM Parallelism: Training Large Models Can Be Surprisingly Communication-Efficient
Max Ryabinin, Tim Dettmers, Michael Diskin, Alexander Borzunov
MoE · 27 Jan 2023
Productivity, Portability, Performance: Data-Centric Python
Yiheng Wang, Yao Zhang, Yanzhang Wang, Yan Wan, Jiao Wang, Zhongyuan Wu, Yuhao Yang, Bowen She
01 Jul 2021
JSDoop and TensorFlow.js: Volunteer Distributed Web Browser-Based Neural Network Training
José Á. Morell, Andrés Camero, Enrique Alba
12 Oct 2019