MCR-DL: Mix-and-Match Communication Runtime for Deep Learning
arXiv:2303.08374, 15 March 2023
Quentin G. Anthony, A. A. Awan, Jeff Rasley, Yuxiong He, A. Shafi, Mustafa Abduljabbar, Hari Subramoni, D. Panda
Papers citing "MCR-DL: Mix-and-Match Communication Runtime for Deep Learning" (4 papers):
Scalable and Efficient MoE Training for Multitask Multilingual Models
Young Jin Kim, A. A. Awan, Alexandre Muzio, Andres Felipe Cruz Salinas, Liyang Lu, Amr Hendy, Samyam Rajbhandari, Yuxiong He, Hany Awadalla
22 Sep 2021
ZeRO-Offload: Democratizing Billion-Scale Model Training
Jie Ren, Samyam Rajbhandari, Reza Yazdani Aminabadi, Olatunji Ruwase, Shuangyang Yang, Minjia Zhang, Dong Li, Yuxiong He
18 Jan 2021
The Pile: An 800GB Dataset of Diverse Text for Language Modeling
Leo Gao, Stella Biderman, Sid Black, Laurence Golding, Travis Hoppe, ..., Horace He, Anish Thite, Noa Nabeshima, Shawn Presser, Connor Leahy
31 Dec 2020
Megatron-LM: Training Multi-Billion Parameter Language Models Using Model Parallelism
M. Shoeybi, M. Patwary, Raul Puri, P. LeGresley, Jared Casper, Bryan Catanzaro
17 Sep 2019