
Tight Regret Bounds for Bayesian Optimization in One Dimension

2018-05-30 · ICML 2018

Jonathan Scarlett


Abstract

We consider the problem of Bayesian optimization (BO) in one dimension, under a Gaussian process prior and Gaussian sampling noise. We provide a theoretical analysis showing that, under fairly mild technical assumptions on the kernel, the best possible cumulative regret up to time T behaves as Ω(√T) and O(√(T log T)). This gives a tight characterization up to a √(log T) factor, and includes the first non-trivial lower bound for noisy BO. Our assumptions are satisfied, for example, by the squared exponential and Matérn-ν kernels, with the latter requiring ν > 2. Our results certify the near-optimality of existing bounds (Srinivas et al., 2009) for the SE kernel, while proving them to be strictly suboptimal for the Matérn kernel with ν > 2.
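To make the setting concrete, here is a minimal, hedged sketch of noisy 1D Bayesian optimization with a GP-UCB-style acquisition rule under a squared exponential kernel — one of the kernels covered by the paper's assumptions. This is not the paper's algorithm or analysis; the lengthscale, noise level, exploration weight `beta`, and grid discretization are all illustrative choices, and the code only demonstrates the setup (GP posterior, noisy queries, cumulative regret) that the bounds above refer to.

```python
import numpy as np

def se_kernel(x1, x2, lengthscale=0.2):
    # Squared exponential (SE) kernel on a 1D domain.
    return np.exp(-0.5 * ((x1[:, None] - x2[None, :]) / lengthscale) ** 2)

def gp_posterior(X, y, Xs, noise_var=0.01):
    # Standard GP posterior mean/variance under Gaussian sampling noise.
    K = se_kernel(X, X) + noise_var * np.eye(len(X))
    Ks = se_kernel(X, Xs)
    Kss = se_kernel(Xs, Xs)
    sol = np.linalg.solve(K, Ks)
    mu = sol.T @ y
    var = np.clip(np.diag(Kss) - np.einsum('ij,ij->j', Ks, sol), 1e-12, None)
    return mu, var

def run_bo(f, T=50, noise_std=0.1, seed=0):
    # GP-UCB-style loop; returns the cumulative regret after each round.
    rng = np.random.default_rng(seed)
    grid = np.linspace(0.0, 1.0, 200)   # discretized 1D domain (illustrative)
    X = np.array([0.5])                 # arbitrary first query point
    y = np.array([f(X[0]) + noise_std * rng.normal()])
    fstar = f(grid).max()               # best value on the grid
    regrets = []
    for t in range(1, T):
        mu, var = gp_posterior(X, y, grid, noise_var=noise_std**2)
        beta = 2.0 * np.log(len(grid) * (t + 1) ** 2)  # exploration weight (heuristic)
        x_next = grid[np.argmax(mu + np.sqrt(beta * var))]
        X = np.append(X, x_next)
        y = np.append(y, f(x_next) + noise_std * rng.normal())
        regrets.append(fstar - f(x_next))
    return np.cumsum(regrets)

cum_regret = run_bo(lambda x: np.sin(3 * np.pi * np.asarray(x)))
```

Under the paper's assumptions, the cumulative regret of the best possible algorithm grows on the order of √T up to the √(log T) factor; a sketch like this can only illustrate the quantity being bounded, not the matching lower-bound construction.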
