
arXiv:2012.08489
Amazon SageMaker Automatic Model Tuning: Scalable Gradient-Free Optimization

15 December 2020
Valerio Perrone
Huibin Shen
Aida Zolic
I. Shcherbatyi
Amr Ahmed
Tanya Bansal
Michele Donini
Fela Winkelmolen
Rodolphe Jenatton
Jean Baptiste Faddoul
Barbara Pogorzelska
Miroslav Miladinovic
K. Kenthapadi
Matthias Seeger
Cédric Archambeau
Abstract

Tuning complex machine learning systems is challenging. Machine learning typically requires setting hyperparameters, be they regularization, architecture, or optimization parameters, whose tuning is critical for good predictive performance. To democratize access to machine learning systems, it is essential to automate this tuning. This paper presents Amazon SageMaker Automatic Model Tuning (AMT), a fully managed system for gradient-free optimization at scale. AMT finds the best version of a trained machine learning model by repeatedly evaluating it with different hyperparameter configurations. It leverages either random search or Bayesian optimization to choose the hyperparameter values resulting in the best model, as measured by the metric chosen by the user. AMT can be used with built-in algorithms, custom algorithms, and Amazon SageMaker pre-built containers for machine learning frameworks. We discuss the core functionality, system architecture, our design principles, and lessons learned. We also describe more advanced features of AMT, such as automated early stopping and warm-starting, and show their benefits to users in experiments.
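To illustrate the gradient-free search the abstract describes, here is a minimal sketch of random search over a hyperparameter space in plain Python. The objective `validation_error` and the parameter ranges are hypothetical stand-ins: in AMT the metric would come from an actual SageMaker training job, and the search space would be declared by the user.

```python
import random

# Hypothetical objective: pretend validation error as a function of two
# hyperparameters. In AMT this value would be parsed from training-job logs.
def validation_error(learning_rate, num_layers):
    return (learning_rate - 0.1) ** 2 + 0.01 * abs(num_layers - 4)

def random_search(n_trials, seed=0):
    """Sample configurations uniformly from the search space and
    keep the one with the lowest objective value."""
    rng = random.Random(seed)
    best_config, best_score = None, float("inf")
    for _ in range(n_trials):
        config = {
            "learning_rate": rng.uniform(1e-3, 1.0),  # continuous parameter
            "num_layers": rng.randint(1, 8),          # integer parameter
        }
        score = validation_error(**config)
        if score < best_score:
            best_config, best_score = config, score
    return best_config, best_score

best, score = random_search(n_trials=50)
```

Bayesian optimization, AMT's other strategy, replaces the uniform sampling step with a surrogate model that proposes configurations expected to improve on the trials observed so far; the evaluate-and-keep-best loop is otherwise the same.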
