Sizeless: Predicting the optimal size of serverless functions

28 October 2020
Simon Eismann, Long Bui, Johannes Grohmann, Cristina L. Abad, N. Herbst, Samuel Kounev
arXiv: 2010.15162
Abstract

Serverless functions are a cloud computing paradigm that reduces operational overhead for developers, because the cloud provider takes care of resource management tasks such as resource provisioning, deployment, and auto-scaling. The only resource management task that developers remain in charge of is resource sizing, that is, selecting how many resources are allocated to each worker instance. However, because resource sizing is challenging, developers often neglect it despite its significant cost and performance benefits. Existing approaches that aim to automate serverless function resource sizing require dedicated performance tests, which are time-consuming to implement and maintain. In this paper, we introduce Sizeless -- an approach to predict the optimal resource size of a serverless function using monitoring data from a single resource size. As our approach requires only production monitoring data, developers no longer need to implement and maintain representative performance tests. Furthermore, it enables cloud providers, which cannot engage in testing the performance of user functions, to implement resource sizing at the platform level and thereby automate the last resource management task associated with serverless functions. In our evaluation, Sizeless predicted the execution time of the serverless functions of a realistic serverless application with a median prediction accuracy of 93.1%. Using Sizeless to optimize the memory size of this application results in a speedup of 16.7% while simultaneously decreasing costs by 2.5%.
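The abstract describes Sizeless as producing execution-time predictions across resource sizes from monitoring data at a single size, which are then used to pick a memory size with better cost and performance. The sketch below illustrates only that downstream selection step under an assumed AWS-Lambda-style GB-second billing model; it is not the paper's implementation, and the predicted times, pricing constants, and function names are illustrative assumptions.

```python
# Minimal sketch (assumed, not from the paper): given per-memory-size
# execution-time predictions -- the kind of output Sizeless derives from
# monitoring data at a single size -- choose the memory size that minimizes
# per-invocation cost, optionally within a latency budget.
from typing import Optional

# Assumed unit prices, modeled on a typical GB-second billing scheme.
PRICE_PER_GB_SECOND = 0.0000166667  # USD per GB-second (assumption)
PRICE_PER_REQUEST = 0.0000002       # USD per invocation (assumption)


def invocation_cost(memory_mb: float, duration_ms: float) -> float:
    """Cost of a single invocation under the assumed pricing model."""
    gb_seconds = (memory_mb / 1024.0) * (duration_ms / 1000.0)
    return gb_seconds * PRICE_PER_GB_SECOND + PRICE_PER_REQUEST


def pick_memory_size(predicted_ms_by_size: dict,
                     latency_budget_ms: Optional[float] = None) -> int:
    """Return the memory size (MB) with the lowest predicted cost.

    If a latency budget is given, only sizes whose predicted execution time
    stays within the budget are considered; if none qualify, all sizes are.
    """
    candidates = predicted_ms_by_size
    if latency_budget_ms is not None:
        within = {m: t for m, t in candidates.items() if t <= latency_budget_ms}
        if within:
            candidates = within
    return min(candidates, key=lambda m: invocation_cost(m, candidates[m]))


if __name__ == "__main__":
    # Hypothetical predicted execution times (ms) per memory size (MB),
    # standing in for the output of the prediction model.
    predictions = {128: 2400.0, 256: 1300.0, 512: 700.0, 1024: 380.0, 2048: 360.0}
    best = pick_memory_size(predictions, latency_budget_ms=800.0)
    print(f"Selected memory size: {best} MB "
          f"(predicted {predictions[best]:.0f} ms, "
          f"{invocation_cost(best, predictions[best]):.8f} USD/invocation)")
```

With these illustrative numbers the sketch selects 512 MB: larger sizes run faster but cost more per invocation, which mirrors the cost/performance trade-off the paper optimizes.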
