ScaleLLM: A Resource-Frugal LLM Serving Framework by Optimizing End-to-End Efficiency

23 July 2024
Yuhang Yao, Han Jin, Alay Dilipbhai Shah, Shanshan Han, Zijian Hu, Yide Ran, Dimitris Stripelis, Zhaozhuo Xu, Salman Avestimehr, Chang D. Yoo

Papers citing "ScaleLLM: A Resource-Frugal LLM Serving Framework by Optimizing End-to-End Efficiency"

No citing papers are currently listed.