SOTAVerified

Bayesian Optimisation

Expensive black-box functions arise in many disciplines, with applications including tuning the hyperparameters of machine learning algorithms, robotics, and other engineering design problems. Bayesian Optimisation is a principled and efficient technique for the global optimisation of such functions. The idea is to place a prior distribution over the target function and then update that prior with a set of “true” observations, obtained by expensively evaluating the target function, to produce a posterior predictive distribution. The posterior then informs where to make the next observation through an acquisition function, which balances exploitation of regions known to perform well with exploration of regions where there is little information about the function’s response.

Source: A Bayesian Approach for the Robust Optimisation of Expensive-to-Evaluate Functions
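The prior-update-acquire loop described above can be sketched in a few lines. This is a minimal illustration, not the method of the cited paper: it assumes a 1D Forrester test function as the stand-in "expensive" target, a Gaussian process prior with an RBF kernel (unit variance, fixed lengthscale), and expected improvement as the acquisition function, maximised over a fixed grid.

```python
import numpy as np
from scipy.special import erf

# Stand-in "expensive" black-box target (the 1D Forrester test function).
def f(x):
    return (6 * x - 2) ** 2 * np.sin(12 * x - 4)

def rbf_kernel(a, b, lengthscale=0.1):
    # Squared-exponential kernel: defines the GP prior over the target.
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / lengthscale ** 2)

def gp_posterior(X, y, X_star, noise=1e-6):
    # Condition the GP prior on the observations (X, y) to get the
    # posterior predictive mean and standard deviation at X_star.
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    K_s = rbf_kernel(X, X_star)
    K_inv = np.linalg.inv(K)
    mu = K_s.T @ K_inv @ y
    var = np.diag(rbf_kernel(X_star, X_star) - K_s.T @ K_inv @ K_s)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, y_best):
    # EI for minimisation: trades off exploitation (low posterior mean)
    # against exploration (high posterior uncertainty).
    z = (y_best - mu) / sigma
    Phi = 0.5 * (1 + erf(z / np.sqrt(2)))              # standard normal cdf
    phi = np.exp(-0.5 * z ** 2) / np.sqrt(2 * np.pi)   # standard normal pdf
    return (y_best - mu) * Phi + sigma * phi

# A few initial "true" observations of the target.
X = np.array([0.1, 0.5, 0.9])
y = f(X)
grid = np.linspace(0.0, 1.0, 201)

# BO loop: update the posterior, then observe where EI is largest.
for _ in range(10):
    mu, sigma = gp_posterior(X, y, grid)
    x_next = grid[np.argmax(expected_improvement(mu, sigma, y.min()))]
    X = np.append(X, x_next)
    y = np.append(y, f(x_next))

best_x, best_y = X[y.argmin()], y.min()
```

In practice the grid search over the acquisition function is replaced by a continuous optimiser, and the kernel hyperparameters are learned from the data rather than fixed.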

Papers

Showing 181–190 of 221 papers

| Title | Status | Hype |
| --- | --- | --- |
| Batch Bayesian Optimization via Particle Gradient Flows | Code | 0 |
| Batch Bayesian Optimization via Local Penalization | Code | 0 |
| GPflowOpt: A Bayesian Optimization Library using TensorFlow | Code | 0 |
| Neural Architecture Search with Bayesian Optimisation and Optimal Transport | Code | 0 |
| Batch Bayesian optimisation via density-ratio estimation with guarantees | Code | 0 |
| Greed is Good: Exploration and Exploitation Trade-offs in Bayesian Optimisation | Code | 0 |
| HEBO: Pushing The Limits of Sample-Efficient Hyperparameter Optimisation | Code | 0 |
| Neuroadaptive electroencephalography: a proof-of-principle study in infants | Code | 0 |
| Nonmyopic Global Optimisation via Approximate Dynamic Programming | Code | 0 |
| Automated Machine Learning for Positive-Unlabelled Learning | Code | 0 |
