PiPar: Pipeline Parallelism for Collaborative Machine Learning

1 December 2022 · arXiv:2302.12803
Zihan Zhang, Philip Rodgers, Peter Kilpatrick, I. Spence, Blesson Varghese
Topics: FedML

Papers citing "PiPar: Pipeline Parallelism for Collaborative Machine Learning"

4 / 4 papers shown
1. EcoFed: Efficient Communication for DNN Partitioning-based Federated Learning
   Di Wu, R. Ullah, Philip Rodgers, Peter Kilpatrick, I. Spence, Blesson Varghese
   FedML · 1 citation · 11 Apr 2023

2. Chimera: Efficiently Training Large-Scale Neural Networks with Bidirectional Pipelines
   Shigang Li, Torsten Hoefler
   GNN, AI4CE, LRM · 131 citations · 14 Jul 2021

3. On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
   N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang
   ODL · 2,886 citations · 15 Sep 2016

4. ImageNet Large Scale Visual Recognition Challenge
   Olga Russakovsky, Jia Deng, Hao Su, J. Krause, S. Satheesh, ..., A. Karpathy, A. Khosla, Michael S. Bernstein, Alexander C. Berg, Li Fei-Fei
   VLM, ObjD · 39,190 citations · 01 Sep 2014