carps: A Framework for Comparing N Hyperparameter Optimizers on M Benchmarks

Hyperparameter optimization (HPO) is crucial for developing well-performing machine learning models. To ease the prototyping and benchmarking of HPO methods, we propose carps, a benchmark framework for Comprehensive Automated Research Performance Studies that allows evaluating N optimizers on M benchmark tasks. In this first release of carps, we focus on the four most important HPO task types: blackbox, multi-fidelity, multi-objective, and multi-fidelity-multi-objective. With 3 336 tasks from 5 community benchmark collections and 28 variants of 9 optimizer families, we offer the largest go-to library to date for evaluating and comparing HPO methods. The carps framework relies on a purpose-built, lightweight interface that glues together optimizers and benchmark tasks. It also features an analysis pipeline that facilitates the evaluation of optimizers on benchmarks. However, navigating a huge number of tasks while developing and comparing methods can be computationally infeasible. To address this, we obtain a subset of representative tasks by minimizing the star discrepancy of the subset in the space spanned by the full set. As a result, we propose an initial subset of 10 to 30 diverse tasks per task type and include functionality to re-compute subsets as more benchmarks become available, enabling efficient evaluations. We also establish a first set of baseline results on these tasks as a reference for future comparisons. With carps (this https URL), we take an important step toward the standardization of HPO evaluation.
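The subset-selection idea above can be illustrated with a minimal sketch: embed each task as a point in the unit cube (via normalized task features) and greedily grow a subset whose star discrepancy stays low. This is a hypothetical, brute-force illustration of the general technique, not the actual carps implementation; the function names and the greedy strategy are our own assumptions.

```python
import itertools
import random

def star_discrepancy(points, dim):
    """Brute-force star discrepancy of a point set in [0, 1]^dim.
    Candidate anchored boxes [0, x) use coordinates drawn from the
    points themselves (plus 1.0); this suffices for a small sketch,
    up to open/closed boundary details. Exponential in dim, so only
    suitable for tiny examples."""
    n = len(points)
    grids = [sorted({p[d] for p in points} | {1.0}) for d in range(dim)]
    worst = 0.0
    for corner in itertools.product(*grids):
        vol = 1.0
        for c in corner:
            vol *= c  # volume of the anchored box [0, corner)
        inside = sum(all(p[d] < corner[d] for d in range(dim)) for p in points)
        worst = max(worst, abs(vol - inside / n))
    return worst

def greedy_subset(points, k, dim, seed=0):
    """Greedily grow a k-point subset that keeps star discrepancy low.
    A simple heuristic stand-in for the exact subset optimization."""
    rng = random.Random(seed)
    chosen = [rng.choice(points)]
    remaining = [p for p in points if p is not chosen[0]]
    while len(chosen) < k:
        best = min(remaining, key=lambda p: star_discrepancy(chosen + [p], dim))
        chosen.append(best)
        remaining.remove(best)
    return chosen
```

For example, `greedy_subset(task_points, k=10, dim=2)` would pick 10 tasks spread evenly over the 2-D feature space; in practice, exact or smarter approximate discrepancy solvers are needed once the task set grows large.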
@article{benjamins2025_2506.06143,
  title   = {carps: A Framework for Comparing N Hyperparameter Optimizers on M Benchmarks},
  author  = {Carolin Benjamins and Helena Graf and Sarah Segel and Difan Deng and Tim Ruhkopf and Leona Hennig and Soham Basu and Neeratyoy Mallik and Edward Bergman and Deyao Chen and François Clément and Matthias Feurer and Katharina Eggensperger and Frank Hutter and Carola Doerr and Marius Lindauer},
  journal = {arXiv preprint arXiv:2506.06143},
  year    = {2025}
}