SOTAVerified

Bayesian Optimisation

Expensive-to-evaluate black-box functions arise in many disciplines, including tuning the hyperparameters of machine learning algorithms, robotics, and other engineering design problems. Bayesian Optimisation is a principled and efficient technique for the global optimisation of such functions. The idea behind Bayesian Optimisation is to place a prior distribution over the target function and then update that prior with a set of "true" observations, obtained by expensively evaluating the target function, to produce a posterior predictive distribution. The posterior then informs where to make the next observation of the target function through the use of an acquisition function, which balances the exploitation of regions known to have good performance with the exploration of regions where there is little information about the function's response.

Source: A Bayesian Approach for the Robust Optimisation of Expensive-to-Evaluate Functions
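The loop described above can be sketched in a few dozen lines. This is a minimal illustration, not the method of the cited paper: the toy objective, the RBF kernel with unit lengthscale, the grid of candidates, and the expected-improvement acquisition are all assumed choices made for the example.

```python
import math
import numpy as np

def f(x):
    # Toy "expensive" black-box objective with its maximum at x = 2.
    return -(x - 2.0) ** 2

def rbf(a, b, lengthscale=1.0):
    # Squared-exponential kernel between two 1-D point sets.
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / lengthscale ** 2)

def gp_posterior(X, y, Xs):
    # Gaussian-process posterior mean and std at test points Xs
    # (zero prior mean, unit signal variance, small jitter for stability).
    K = rbf(X, X) + 1e-6 * np.eye(len(X))
    Ks = rbf(Xs, X)
    mu = Ks @ np.linalg.solve(K, y)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, y_best):
    # EI for maximisation: large where the mean is high (exploitation)
    # or the uncertainty is high (exploration).
    z = (mu - y_best) / sigma
    cdf = np.array([0.5 * (1.0 + math.erf(v / math.sqrt(2.0))) for v in z])
    pdf = np.exp(-0.5 * z ** 2) / math.sqrt(2.0 * math.pi)
    return (mu - y_best) * cdf + sigma * pdf

grid = np.linspace(0.0, 5.0, 101)   # candidate evaluation locations
X = np.array([0.0, 5.0])            # initial design
y = f(X)

for _ in range(8):
    # Fit the posterior, maximise the acquisition, observe the target.
    mu, sigma = gp_posterior(X, y, grid)
    x_next = grid[np.argmax(expected_improvement(mu, sigma, y.max()))]
    X = np.append(X, x_next)
    y = np.append(y, f(x_next))

best_x, best_y = X[np.argmax(y)], y.max()
print(f"best x = {best_x:.2f}, f(x) = {best_y:.3f}")
```

Each iteration spends one expensive evaluation where expected improvement is highest, so the search concentrates near the optimum at x = 2 after only a handful of observations.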

Papers

Showing 41–50 of 221 papers

Title | Status | Hype
Adaptive Batch Sizes for Active Learning: A Probabilistic Numerics Approach | Code | 0
Bayesian optimisation for fast approximate inference in state-space models with intractable likelihoods | Code | 0
A penalisation method for batch multi-objective Bayesian optimisation with application in heat exchanger design | Code | 0
Effective Estimation of Deep Generative Language Models | Code | 0
Distributional Bayesian optimisation for variational inference on black-box simulators | Code | 0
Asynchronous Parallel Bayesian Optimisation via Thompson Sampling | Code | 0
Data-driven Prior Learning for Bayesian Optimisation | Code | 0
Bayesian Optimisation Against Climate Change: Applications and Benchmarks | Code | 0
Asynchronous ε-Greedy Bayesian Optimisation | Code | 0
Detection and classification of vocal productions in large scale audio recordings | Code | 0
Page 5 of 23

No leaderboard results yet.