LAMP: Large Deep Nets with Automated Model Parallelism for Image Segmentation
arXiv:2006.12575 · 22 June 2020 · 3DV
Wentao Zhu, Can Zhao, Wenqi Li, H. Roth, Ziyue Xu, Daguang Xu

Papers citing "LAMP: Large Deep Nets with Automated Model Parallelism for Image Segmentation"

7 citing papers shown.
Systems for Parallel and Distributed Large-Model Deep Learning Training
Kabir Nagrecha · 06 Jan 2023 · GNN, VLM, MoE
Open-Source Skull Reconstruction with MONAI
Jianning Li, André Ferreira, B. Puladi, Victor Alves, Michael Kamp, Moon S. Kim, F. Nensa, Jens Kleesiek, Seyed-Ahmad Ahmadi, Jan Egger · 25 Nov 2022
LOFT: Finding Lottery Tickets through Filter-wise Training
Qihan Wang, Chen Dun, Fangshuo Liao, C. Jermaine, Anastasios Kyrillidis · 28 Oct 2022
ResIST: Layer-Wise Decomposition of ResNets for Distributed Training
Chen Dun, Cameron R. Wolfe, C. Jermaine, Anastasios Kyrillidis · 02 Jul 2021
Test-Time Training for Deformable Multi-Scale Image Registration
Wentao Zhu, Yufang Huang, Daguang Xu, Zhen Qian, Wei Fan, Xiaohui Xie · 25 Mar 2021
GIST: Distributed Training for Large-Scale Graph Convolutional Networks
Cameron R. Wolfe, Jingkang Yang, Arindam Chowdhury, Chen Dun, Artun Bayer, Santiago Segarra, Anastasios Kyrillidis · 20 Feb 2021 · BDL, GNN, LRM
Megatron-LM: Training Multi-Billion Parameter Language Models Using Model Parallelism
M. Shoeybi, M. Patwary, Raul Puri, P. LeGresley, Jared Casper, Bryan Catanzaro · 17 Sep 2019 · MoE