ResearchTrend.AI

arXiv:2407.19721 · Cited By
Rina: Enhancing Ring-AllReduce with In-network Aggregation in Distributed Model Training

29 July 2024
Zixuan Chen
Xuandong Liu
Minglin Li
Yinfan Hu
Hao Mei
Huifeng Xing
Hao Wang
Wanxin Shi
Sen Liu
Yang Xu

Papers citing "Rina: Enhancing Ring-AllReduce with In-network Aggregation in Distributed Model Training"

Megatron-LM: Training Multi-Billion Parameter Language Models Using Model Parallelism
M. Shoeybi, M. Patwary, Raul Puri, P. LeGresley, Jared Casper, Bryan Catanzaro
MoE · 17 Sep 2019