ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

Computron: Serving Distributed Deep Learning Models with Model Parallel Swapping

24 June 2023
Daniel Zou, X. Jin, Xueyang Yu, Haotian Zhang, J. Demmel
MoE

Papers citing "Computron: Serving Distributed Deep Learning Models with Model Parallel Swapping"

2 / 2 papers shown
EnergonAI: An Inference System for 10-100 Billion Parameter Transformer Models
Jiangsu Du, Ziming Liu, Jiarui Fang, Shenggui Li, Yongbin Li, Yutong Lu, Yang You
MoE
27 · 4 · 0
06 Sep 2022
Training language models to follow instructions with human feedback
Long Ouyang, Jeff Wu, Xu Jiang, Diogo Almeida, Carroll L. Wainwright, ..., Amanda Askell, Peter Welinder, Paul Christiano, Jan Leike, Ryan J. Lowe
OSLM · ALM
306 · 11,909 · 0
04 Mar 2022