arXiv:2105.04332
Bayesian Optimistic Optimisation with Exponentially Decaying Regret

10 May 2021
Hung The Tran
Sunil R. Gupta
Santu Rana
Svetha Venkatesh
Abstract

Bayesian optimisation (BO) is a well-known, efficient algorithm for finding the global optimum of expensive, black-box functions. Current practical BO algorithms have regret bounds ranging from $\mathcal{O}(\frac{\log N}{\sqrt{N}})$ to $\mathcal{O}(e^{-\sqrt{N}})$, where $N$ is the number of evaluations. This paper explores the possibility of improving the regret bound in the noiseless setting by intertwining concepts from BO and tree-based optimistic optimisation, which is based on partitioning the search space. We propose the BOO algorithm, the first practical approach that can achieve an exponential regret bound of order $\mathcal{O}(N^{-\sqrt{N}})$ under the assumption that the objective function is sampled from a Gaussian process with a Matérn kernel with smoothness parameter $\nu > 4 + \frac{D}{2}$, where $D$ is the number of dimensions. We perform experiments on the optimisation of various synthetic functions and machine learning hyperparameter tuning tasks, and show that our algorithm outperforms baselines.
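To make the setting concrete, the sketch below shows a generic noiseless BO loop of the kind the abstract builds on: a Gaussian-process surrogate with a Matérn kernel and an optimistic (GP-UCB-style) acquisition rule. This is not the paper's BOO algorithm (which additionally partitions the search space with a tree); it is a minimal illustration using only NumPy, with the kernel length-scale, the UCB coefficient `beta`, and the 1-D test function chosen arbitrarily for the example.

```python
import numpy as np

def matern52(a, b, ls=0.2):
    # Matérn kernel with smoothness nu = 5/2 (a common smooth choice)
    d = np.abs(a[:, None] - b[None, :])
    s = np.sqrt(5.0) * d / ls
    return (1.0 + s + s**2 / 3.0) * np.exp(-s)

def gp_posterior(X, y, Xs, jitter=1e-6):
    # Standard GP regression posterior (noiseless setting, small jitter
    # added to the diagonal for numerical stability)
    K = matern52(X, X) + jitter * np.eye(len(X))
    Ks = matern52(X, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(matern52(Xs, Xs)) - np.sum(v**2, axis=0)
    return mu, np.maximum(var, 0.0)

def bo_maximise(f, n_iters=15, beta=2.0, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(0.0, 1.0, 3)        # small random initial design
    y = np.array([f(x) for x in X])
    grid = np.linspace(0.0, 1.0, 500)   # candidate points on [0, 1]
    for _ in range(n_iters):
        mu, var = gp_posterior(X, y, grid)
        ucb = mu + beta * np.sqrt(var)  # optimistic acquisition: mean + bonus
        x_next = grid[np.argmax(ucb)]
        X = np.append(X, x_next)
        y = np.append(y, f(x_next))
    best = np.argmax(y)
    return X[best], y[best]

# Example: maximise a smooth function on [0, 1]; true optimum is x = 0.5.
x_best, y_best = bo_maximise(lambda x: -(x - 0.5) ** 2)
```

The optimistic acquisition here plays the same role as the upper bounds used to select cells in tree-based optimistic optimisation: both pick the next evaluation where the function could plausibly still be largest.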
