
MOPAR: A Model Partitioning Framework for Deep Learning Inference Services on Serverless Platforms

arXiv:2404.02445 · 3 April 2024
Authors: Jiaang Duan, Shiyou Qian, Dingyu Yang, Hanwen Hu, Jian Cao, Guangtao Xue
Topics: MoE

Papers citing "MOPAR: A Model Partitioning Framework for Deep Learning Inference Services on Serverless Platforms"

2 of 2 papers shown
iServe: An Intent-based Serving System for LLMs
Dimitrios Liakopoulos, Tianrui Hu, Prasoon Sinha, N. Yadwadkar
Topics: VLM
08 Jan 2025

Large Language Models are Zero-Shot Reasoners
Takeshi Kojima, S. Gu, Machel Reid, Yutaka Matsuo, Yusuke Iwasawa
Topics: ReLM, LRM
24 May 2022