
Regret Analysis of Global Optimization in Univariate Functions with Lipschitz Derivatives

Abstract

In this work, we study the problem of global optimization of univariate loss functions, where we analyze the regret of the popular lower bounding algorithms (e.g., the Piyavskii-Shubert algorithm). For any given time $T$, instead of the commonly studied simple regret (the difference between the loss of the best estimate up to time $T$ and that of the global optimizer), we study the cumulative regret up to that time. With a suitable lower bounding algorithm, we show that it is possible to achieve satisfactory cumulative regret bounds for different classes of functions. For Lipschitz continuous functions with parameter $L$, we show that the cumulative regret is $O(L\log T)$. For Lipschitz smooth functions with parameter $H$, we show that the cumulative regret is $O(H)$. We also analytically extend our results to a broader class of functions that covers both Lipschitz continuous and Lipschitz smooth functions individually.
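To make the quantities in the abstract concrete, the following is a minimal sketch (not the paper's exact procedure) of a Piyavskii-Shubert-style lower bounding routine for minimizing an $L$-Lipschitz univariate function on an interval, together with the simple and cumulative regret it incurs. The test function, interval, and constant L used in the example are illustrative assumptions, not values from the paper.

```python
import math


def piyavskii_shubert(f, a, b, L, T):
    """Query f for T rounds on [a, b]; return queried points and losses."""
    xs = [a, b]                      # queried locations (kept sorted)
    ys = [f(a), f(b)]                # observed losses
    for _ in range(max(0, T - 2)):
        # On each interval [x_i, x_{i+1}], the piecewise-linear lower bound
        # max_j (ys[j] - L * |x - xs[j]|) is minimized where the two
        # downward cones from the endpoints intersect.
        best_val, best_x, best_i = math.inf, None, None
        for i in range(len(xs) - 1):
            x_new = 0.5 * (xs[i] + xs[i + 1]) + (ys[i] - ys[i + 1]) / (2 * L)
            v_new = 0.5 * (ys[i] + ys[i + 1]) - 0.5 * L * (xs[i + 1] - xs[i])
            if v_new < best_val:
                best_val, best_x, best_i = v_new, x_new, i
        # Query the minimizer of the current lower bound.
        xs.insert(best_i + 1, best_x)
        ys.insert(best_i + 1, f(best_x))
    return xs, ys


def regrets(ys, f_star):
    """Simple regret (best-so-far gap) and cumulative regret up to time T."""
    simple = min(ys) - f_star
    cumulative = sum(y - f_star for y in ys)
    return simple, cumulative


if __name__ == "__main__":
    # Illustrative smooth loss with global minimum value 0 at x = 0.3.
    f = lambda x: (x - 0.3) ** 2
    L = 2.0                          # a valid Lipschitz constant of f on [0, 1]
    xs, ys = piyavskii_shubert(f, 0.0, 1.0, L, T=50)
    print(regrets(ys, f_star=0.0))
```

The simple regret only tracks the best query so far, whereas the cumulative regret sums the suboptimality of every query, which is the quantity bounded in the abstract (e.g., $O(L\log T)$ for Lipschitz continuous losses).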
