Bayesian optimization with virtual derivative sign observations

4 April 2017
E. Siivola, Aki Vehtari, J. Vanhatalo, Javier I. González
Abstract

Bayesian optimization (BO) is a global optimization strategy designed to find the minimum of an expensive black-box function $g$, typically defined on a continuous subset of $\mathcal{R}^d$. Using a Gaussian process (GP) as a surrogate model for the objective and an acquisition function to systematically search its domain, BO strategies aim to minimize the number of samples required to find the minimum of $g$. Although currently available acquisition functions address this goal with different degrees of success, an over-exploration effect on the contour of $g$ is typically observed. This is due to the myopic nature of most acquisitions, which greedily try to over-reduce uncertainty at the border of the search domain. In most real problems, however, such as the configuration of machine learning algorithms, the function domain is conservatively large, and with high probability the global minimum does not lie on the boundary. We propose a method to incorporate this knowledge into the search process by adding virtual derivative observations at the borders of the search space. We exploit the properties of GP models that allow us to easily impose conditions on the partial derivatives of the objective. The method is applicable with any acquisition function, is easy to use, and consistently reduces the number of evaluations required to find the minimum of $g$ irrespective of the acquisition used. We illustrate the benefits of our approach in a simulation study with a battery of objective functions.
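To make the construction concrete, the following is a minimal one-dimensional sketch, not the authors' implementation, of how virtual derivative information at the borders can be folded into a GP posterior. Because differentiation is a linear operator, an RBF-kernel GP has closed-form cross-covariances between function values and derivatives. The paper observes only the derivative sign through a probit likelihood handled with expectation propagation; this sketch makes the simplifying assumption of noisy derivative-value pseudo-observations (g'(0) ≈ -1, g'(1) ≈ +1 for an interior minimum on [0, 1]). All hyperparameter values and data below are illustrative.

```python
import numpy as np

# GP hyperparameters (illustrative values, not tuned)
SIGMA2, ELL = 1.0, 0.2          # RBF variance and lengthscale
NOISE_F, NOISE_D = 1e-6, 0.1    # noise on function / derivative observations

def k_ff(x, z):
    """RBF covariance between function values, cov(f(x), f(z))."""
    r = x[:, None] - z[None, :]
    return SIGMA2 * np.exp(-r**2 / (2 * ELL**2))

def k_fd(x, z):
    """Cross-covariance cov(f(x), f'(z)) = dk/dz for the RBF kernel."""
    r = x[:, None] - z[None, :]
    return k_ff(x, z) * r / ELL**2

def k_dd(x, z):
    """Derivative covariance cov(f'(x), f'(z)) = d^2 k / (dx dz)."""
    r = x[:, None] - z[None, :]
    return k_ff(x, z) * (1.0 / ELL**2 - r**2 / ELL**4)

# Function evaluations of the black-box objective on [0, 1] (made up)
X = np.array([0.3, 0.5, 0.8])
y = np.array([0.2, -0.4, 0.1])

# Virtual derivative observations at the borders: an interior minimum
# implies g'(0) < 0 and g'(1) > 0.  Treating the signs as noisy
# derivative values is a simplification of the paper's probit/EP scheme.
Xd = np.array([0.0, 1.0])
yd = np.array([-1.0, 1.0])

# Joint covariance over [function observations, derivative observations]
K = np.block([
    [k_ff(X, X) + NOISE_F * np.eye(len(X)),  k_fd(X, Xd)],
    [k_fd(X, Xd).T, k_dd(Xd, Xd) + NOISE_D * np.eye(len(Xd))],
])

# Posterior mean of f on a test grid, conditioned on both kinds of data
Xs = np.linspace(0.0, 1.0, 101)
Ks = np.hstack([k_ff(Xs, X), k_fd(Xs, Xd)])
mu = Ks @ np.linalg.solve(K, np.concatenate([y, yd]))

print("posterior minimizer:", Xs[np.argmin(mu)])
```

Conditioning on the virtual observations tilts the posterior mean upward toward both borders, so any acquisition function driven by this posterior assigns less value to the boundary, which is the over-exploration suppression the abstract describes.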
