Open Loop Hyperparameter Optimization and Determinantal Point Processes

Abstract

Driven by the need for parallelizable hyperparameter optimization methods, this paper studies "open loop" search methods: sequences that are predetermined and can be generated before a single configuration is evaluated. Examples include grid search, uniform random search, low-discrepancy sequences, and other sampling distributions. In particular, we propose the use of k-determinantal point processes (k-DPPs) in hyperparameter optimization via random search. Compared to conventional uniform random search, where hyperparameter settings are sampled independently, a k-DPP promotes diversity. We describe an approach that transforms hyperparameter search spaces for efficient use with a k-DPP. In addition, we introduce a novel Metropolis-Hastings algorithm which can sample from k-DPPs defined over any space from which uniform samples can be drawn, including spaces with a mixture of discrete and continuous dimensions or a tree structure. Our experiments show significant benefits in realistic scenarios with a limited budget for training supervised learners, whether in serial or parallel.
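
As a rough illustration of the kind of Metropolis-Hastings sampler the abstract describes, the sketch below draws a set of k hyperparameter configurations whose stationary distribution is a k-DPP, P(S) ∝ det(L_S). It is a minimal sketch under stated assumptions, not the paper's implementation: the RBF similarity kernel, the uniform single-point swap proposal, and all function names here are illustrative choices.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    """Similarity matrix L with L_ij = exp(-gamma * ||x_i - x_j||^2)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def sample_kdpp_mh(sample_uniform, k, n_steps=2000, gamma=1.0, seed=None):
    """Metropolis-Hastings sampler targeting P(S) proportional to det(L_S).

    Works over any space that admits uniform samples: sample_uniform is a
    callable returning one configuration as a 1-D array (e.g., hyperparameters
    mapped into [0, 1]^d). The proposal replaces one uniformly chosen point
    with a fresh uniform draw; since the proposal is symmetric, the acceptance
    probability reduces to min(1, det(L_{S'}) / det(L_S)).
    """
    rng = np.random.default_rng(seed)
    # Initialize with k independent uniform draws.
    S = np.stack([sample_uniform() for _ in range(k)])
    log_det = np.linalg.slogdet(rbf_kernel(S, gamma))[1]
    for _ in range(n_steps):
        i = rng.integers(k)               # index of the point to swap out
        proposal = S.copy()
        proposal[i] = sample_uniform()    # fresh uniform proposal
        prop_log_det = np.linalg.slogdet(rbf_kernel(proposal, gamma))[1]
        # Accept or reject via the log determinant ratio.
        if np.log(rng.random()) < prop_log_det - log_det:
            S, log_det = proposal, prop_log_det
    return S

if __name__ == "__main__":
    # Example: 10 diverse points in a 2-D search space scaled to [0, 1]^2.
    draw_rng = np.random.default_rng(0)
    draw = lambda: draw_rng.random(2)
    configs = sample_kdpp_mh(draw, k=10, n_steps=2000, gamma=10.0, seed=1)
    print(configs)
```

Because the only requirement on the search space is a uniform sampler, the same loop applies unchanged to mixed discrete/continuous or tree-structured spaces, provided configurations are encoded as vectors on which a similarity kernel can be evaluated.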
