SOTAVerified

Bayesian Optimisation

Expensive-to-evaluate black-box functions arise in many disciplines, including hyperparameter tuning for machine learning algorithms, robotics, and other engineering design problems. Bayesian Optimisation is a principled and sample-efficient technique for the global optimisation of such functions. The idea behind Bayesian Optimisation is to place a prior distribution over the target function and then update that prior with a set of observations obtained by (expensively) evaluating the function, yielding a posterior predictive distribution. An acquisition function computed from this posterior then selects where to make the next observation, balancing the exploitation of regions known to perform well against the exploration of regions where there is little information about the function’s response.
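The loop described above can be sketched in a few dozen lines: a Gaussian process surrogate supplies the posterior mean and uncertainty, and an expected-improvement acquisition picks the next evaluation point. This is a minimal illustrative sketch, not the method of any particular paper listed below; the kernel length scale, the toy objective, and the grid-based acquisition maximisation are all simplifying assumptions.

```python
import numpy as np
from scipy.stats import norm

def rbf_kernel(a, b, length_scale=0.3):
    """Squared-exponential kernel between two sets of 1-D inputs."""
    d = a.reshape(-1, 1) - b.reshape(1, -1)
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(x_train, y_train, x_query, noise=1e-6):
    """Posterior mean and std of a zero-mean GP at x_query."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_train, x_query)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    # Prior variance of the RBF kernel is 1 at every point.
    var = np.clip(1.0 - np.sum(v ** 2, axis=0), 1e-12, None)
    return mean, np.sqrt(var)

def expected_improvement(mean, std, best_y):
    """EI for minimisation: trades off exploitation against exploration."""
    z = (best_y - mean) / std
    return (best_y - mean) * norm.cdf(z) + std * norm.pdf(z)

def bayes_opt(f, bounds=(0.0, 1.0), n_init=3, n_iter=10, seed=0):
    """Run a simple Bayesian optimisation loop on a grid of candidates."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(*bounds, size=n_init)          # initial random design
    y = np.array([f(xi) for xi in x])              # "expensive" evaluations
    grid = np.linspace(*bounds, 200)               # candidate locations
    for _ in range(n_iter):
        mean, std = gp_posterior(x, y, grid)
        ei = expected_improvement(mean, std, y.min())
        x_next = grid[np.argmax(ei)]               # maximise the acquisition
        x = np.append(x, x_next)
        y = np.append(y, f(x_next))                # one more costly observation
    return x[np.argmin(y)], y.min()

# Toy "expensive" objective with its minimum at x = 0.65 (an assumption
# for illustration only).
x_best, y_best = bayes_opt(lambda x: (x - 0.65) ** 2)
```

In practice the acquisition function is maximised with a proper optimiser rather than a fixed grid, and the kernel hyperparameters are fitted to the data; libraries such as BoTorch (listed in the table below) handle both.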

Source: A Bayesian Approach for the Robust Optimisation of Expensive-to-Evaluate Functions

Papers

Showing 161–170 of 221 papers

| Title | Status | Hype |
| --- | --- | --- |
| Interactive Text Ranking with Bayesian Optimisation: A Case Study on Community QA and Summarisation | Code | 0 |
| Parameter Optimization and Learning in a Spiking Neural Network for UAV Obstacle Avoidance targeting Neuromorphic Processors | | 0 |
| Achieving Robustness to Aleatoric Uncertainty with Heteroscedastic Bayesian Optimisation | Code | 1 |
| Distributional Bayesian optimisation for variational inference on black-box simulators | Code | 0 |
| BoTorch: A Framework for Efficient Monte-Carlo Bayesian Optimization | Code | 2 |
| Batch simulations and uncertainty quantification in Gaussian process surrogate approximate Bayesian computation | | 0 |
| Optimal experimental design via Bayesian optimization: active causal structure learning for Gaussian process networks | | 0 |
| Antifragile and Robust Heteroscedastic Bayesian Optimisation | | 0 |
| Bayesian Optimisation with Gaussian Processes for Premise Selection | | 0 |
| Cost-aware Multi-objective Bayesian optimisation | | 0 |
Page 17 of 23

No leaderboard results yet.