Nebula-I: A General Framework for Collaboratively Training Deep Learning Models on Low-Bandwidth Cloud Clusters

19 May 2022
Yang Xiang, Zhihua Wu, Weibao Gong, Siyu Ding, Xianjie Mo, Yuang Liu, Shuohuan Wang, Peng Liu, Yongshuai Hou, Long Li, Bin Wang, S. Shi, Yaqian Han, Yue Yu, Ge Li, Yu Sun, Yanjun Ma, Dianhai Yu

Papers citing "Nebula-I: A General Framework for Collaboratively Training Deep Learning Models on Low-Bandwidth Cloud Clusters" (5 papers)
Federated Learning Challenges and Opportunities: An Outlook
Jie Ding, Eric W. Tramel, Anit Kumar Sahu, Shuang Wu, Salman Avestimehr, Tao Zhang
01 Feb 2022
MSP: Multi-Stage Prompting for Making Pre-trained Language Models Better Translators
Zhixing Tan, Xiangwen Zhang, Shuo Wang, Yang Liu
13 Oct 2021
COCO-LM: Correcting and Contrasting Text Sequences for Language Model Pretraining
Yu Meng, Chenyan Xiong, Payal Bajaj, Saurabh Tiwary, Paul N. Bennett, Jiawei Han, Xia Song
16 Feb 2021
ERNIE-M: Enhanced Multilingual Representation by Aligning Cross-lingual Semantics with Monolingual Corpora
Ouyang Xuan, Shuohuan Wang, Chao Pang, Yu Sun, Hao Tian, Hua-Hong Wu, Haifeng Wang
31 Dec 2020
Megatron-LM: Training Multi-Billion Parameter Language Models Using Model Parallelism
M. Shoeybi, M. Patwary, Raul Puri, P. LeGresley, Jared Casper, Bryan Catanzaro
17 Sep 2019