SOTAVerified

Bayesian Optimisation

Expensive black-box functions arise in many disciplines, including hyperparameter tuning of machine learning algorithms, robotics, and other engineering design problems. Bayesian Optimisation is a principled and efficient technique for the global optimisation of such functions. The idea is to place a prior distribution over the target function and then update that prior with a set of “true” observations, obtained by expensively evaluating the function, to produce a posterior predictive distribution. The posterior then informs where to make the next observation through an acquisition function, which balances exploitation of regions known to perform well against exploration of regions where there is little information about the function’s response.

Source: A Bayesian Approach for the Robust Optimisation of Expensive-to-Evaluate Functions
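The loop described above (fit a surrogate to the observations, maximise an acquisition function, evaluate the expensive function at the chosen point, repeat) can be sketched in a few lines. This is a minimal illustration, assuming a zero-mean Gaussian process surrogate with an RBF kernel and the Expected Improvement acquisition; all function names (`gp_posterior`, `expected_improvement`, `objective`) are illustrative, not from any particular library.

```python
import numpy as np
from scipy.stats import norm

def rbf_kernel(a, b, length_scale=0.3):
    # Squared-exponential kernel k(x, x') = exp(-(x - x')^2 / (2 l^2)).
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(x_train, y_train, x_query, noise=1e-6):
    # Posterior predictive mean and std dev of a zero-mean GP surrogate.
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_query, x_train)
    Kss = rbf_kernel(x_query, x_query)
    K_inv = np.linalg.inv(K)
    mu = Ks @ K_inv @ y_train
    var = np.diag(Kss - Ks @ K_inv @ Ks.T)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best_y):
    # EI for minimisation: high where the mean is low (exploitation)
    # or the uncertainty is high (exploration).
    imp = best_y - mu
    z = imp / sigma
    return imp * norm.cdf(z) + sigma * norm.pdf(z)

def objective(x):
    # Stand-in "expensive" black-box function; in practice this would be
    # a simulation, experiment, or model-training run.
    return np.sin(3 * x) + x ** 2 - 0.7 * x

rng = np.random.default_rng(0)
x_train = rng.uniform(-2, 2, size=3)          # a few initial observations
y_train = objective(x_train)
candidates = np.linspace(-2, 2, 200)          # acquisition maximised by grid search

for _ in range(10):
    mu, sigma = gp_posterior(x_train, y_train, candidates)
    ei = expected_improvement(mu, sigma, y_train.min())
    x_next = candidates[np.argmax(ei)]        # most promising next observation
    x_train = np.append(x_train, x_next)
    y_train = np.append(y_train, objective(x_next))  # expensive evaluation

print(x_train[np.argmin(y_train)], y_train.min())
```

Each iteration spends one expensive evaluation where the acquisition function judges it most valuable, which is why the technique is sample-efficient; in practice the acquisition would be maximised with a proper optimiser rather than a grid, and the kernel hyperparameters would be learned from the data.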

Papers

Showing 191–200 of 221 papers

Title | Status | Hype
Asynchronous Parallel Bayesian Optimisation via Thompson Sampling | Code | 0
High-Dimensional Bayesian Optimisation with Variational Autoencoders and Deep Metric Learning | Code | 0
Asynchronous ε-Greedy Bayesian Optimisation | Code | 0
How Bayesian Should Bayesian Optimisation Be? | Code | 0
Hyperparameter Learning via Distributional Transfer | Code | 0
The case for fully Bayesian optimisation in small-sample trials | Code | 0
Investigating Bayesian optimization for expensive-to-evaluate black box functions: Application in fluid dynamics | Code | 0
On the development of a practical Bayesian optimisation algorithm for expensive experiments and simulations with changing environmental conditions | Code | 0
Are Random Decompositions all we need in High Dimensional Bayesian Optimisation? | Code | 0
Robust and Conjugate Gaussian Process Regression | Code | 0
Page 20 of 23

No leaderboard results yet.