
Modulating Surrogates for Bayesian Optimization

26 June 2019
Erik Bodin
Markus Kaiser
Ieva Kazlauskaite
Zhenwen Dai
Neill D. F. Campbell
Carl Henrik Ek
arXiv: 1906.11152 (abs, PDF, HTML)
Abstract

Bayesian optimization (BO) methods often rely on the assumption that the objective function is well-behaved, but in practice objective functions are seldom well-behaved even when noise-free observations can be collected. We propose to address this issue by focusing on the well-behaved structure that is informative for search while ignoring detrimental structure that is challenging to model data-efficiently. We use a noise distribution to absorb the challenging details by treating them as irreducible uncertainty. In particular, we use a latent Gaussian process as the surrogate model, in which a latent variable is introduced to the input of a Gaussian process and serves as a noise variable. This allows the noise distribution to be non-stationary and non-Gaussian. With experiments on a range of BO benchmarks, we show that our method significantly outperforms existing methods.

Keywords: robust surrogate models, Bayesian Optimization, nonsmooth objective functions, Latent Gaussian Processes
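As a minimal sketch of the idea described above, the snippet below augments each GP input with a latent coordinate w that can absorb hard-to-model detail, so the effective noise is non-stationary and non-Gaussian. The kernel choice, hyperparameters, and the crude Monte Carlo marginalization of w are illustrative assumptions for exposition; the paper itself uses a trained latent Gaussian process surrogate, not this simplified procedure.

```python
# Sketch of a latent-variable GP surrogate: inputs are augmented with a latent
# "noise" coordinate w ~ N(0, 1). All names and hyperparameters are illustrative.
import numpy as np

def rbf_kernel(A, B, lengthscale=0.5, variance=1.0):
    """Squared-exponential kernel on (possibly augmented) inputs."""
    sq_dists = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)

def latent_gp_predict(X, y, X_star, n_latent_samples=20, jitter=1e-6, seed=None):
    """Predictive mean of a GP whose inputs are augmented by a latent variable.
    The latent coordinate lets the surrogate treat challenging structure as
    irreducible uncertainty; here w is marginalized by naive Monte Carlo."""
    rng = np.random.default_rng(seed)
    means = []
    for _ in range(n_latent_samples):
        # Sample latent coordinates for the training inputs; predict at w = 0.
        W = rng.standard_normal((X.shape[0], 1))
        W_star = np.zeros((X_star.shape[0], 1))
        Xa, Xa_star = np.hstack([X, W]), np.hstack([X_star, W_star])
        K = rbf_kernel(Xa, Xa) + jitter * np.eye(X.shape[0])
        K_star = rbf_kernel(Xa_star, Xa)
        means.append(K_star @ np.linalg.solve(K, y))
    return np.mean(means, axis=0)

# Toy usage: a 1-D objective with an abrupt jump that a plain GP models poorly.
X = np.linspace(0, 1, 15)[:, None]
y = np.sin(6 * X[:, 0]) + (X[:, 0] > 0.5).astype(float)
X_star = np.linspace(0, 1, 50)[:, None]
mu = latent_gp_predict(X, y, X_star, seed=0)
```

In a full BO loop this predictive mean (together with a predictive variance) would feed an acquisition function; the point of the latent input is that the surrogate does not have to fit the jump exactly to remain useful for search.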
