SOTAVerified

Gaussian Processes

Gaussian processes are a powerful framework for several machine learning tasks such as regression, classification, and inference. Given a finite set of input-output training data generated by a fixed (but possibly unknown) function, the framework models the unknown function as a stochastic process such that the training outputs are a finite number of jointly Gaussian random variables, whose properties can then be used to infer the statistics (the mean and variance) of the function at test inputs.

Source: Sequential Randomized Matrix Factorization for Gaussian Processes: Efficient Predictions and Hyper-parameter Optimization
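The posterior inference described above can be sketched in a few lines of NumPy. This is an illustrative example only, not code from the cited paper; the kernel choice (squared-exponential), its hyper-parameters, and the function names (`rbf_kernel`, `gp_posterior`) are assumptions made for the sketch.

```python
import numpy as np

def rbf_kernel(a, b, length_scale=1.0, variance=1.0):
    """Squared-exponential (RBF) covariance between 1-D input arrays a and b."""
    sq_dist = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * sq_dist / length_scale**2)

def gp_posterior(x_train, y_train, x_test, noise=1e-2):
    """Posterior mean and variance of a zero-mean GP at the test inputs."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_test)
    K_ss = rbf_kernel(x_test, x_test)
    # Cholesky factorization for a stable solve of K^{-1} y
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha                        # predictive mean
    v = np.linalg.solve(L, K_s)
    var = np.diag(K_ss) - np.sum(v**2, axis=0)  # predictive variance
    return mean, var

# Noisy observations of sin(x) as toy training data
rng = np.random.default_rng(0)
x_train = np.linspace(0.0, 5.0, 8)
y_train = np.sin(x_train) + 0.05 * rng.normal(size=x_train.size)
x_test = np.linspace(0.0, 5.0, 50)
mean, var = gp_posterior(x_train, y_train, x_test)
```

The predictive variance shrinks near the training inputs and grows between them, which is exactly the uncertainty information that methods such as Bayesian optimization and active learning exploit.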

Papers

Showing 1751-1775 of 1963 papers

| Title | Status | Hype |
| --- | --- | --- |
| Task Diversity in Bayesian Federated Learning: Simultaneous Processing of Classification and Regression | Code | 0 |
| Hierarchical-Hyperplane Kernels for Actively Learning Gaussian Process Models of Nonstationary Systems | Code | 0 |
| Hierarchical Inducing Point Gaussian Process for Inter-domain Observations | Code | 0 |
| Neural Operator Variational Inference based on Regularized Stein Discrepancy for Deep Gaussian Processes | Code | 0 |
| Neural signature kernels as infinite-width-depth-limits of controlled ResNets | Code | 0 |
| Deterministic error bounds for kernel-based learning techniques under bounded noise | Code | 0 |
| Detecting Misclassification Errors in Neural Networks with a Gaussian Process Model | Code | 0 |
| Variational Bayesian Multiple Instance Learning With Gaussian Processes | Code | 0 |
| Deep Variational Implicit Processes | Code | 0 |
| Deep Structured Mixtures of Gaussian Processes | Code | 0 |
| Considering discrepancy when calibrating a mechanistic electrophysiology model | Code | 0 |
| Hodge-Compositional Edge Gaussian Processes | Code | 0 |
| How Bayesian Should Bayesian Optimisation Be? | Code | 0 |
| Adjusting Model Size in Continual Gaussian Processes: How Big is Big Enough? | Code | 0 |
| Taylorformer: Probabilistic Modelling for Random Processes including Time Series | Code | 0 |
| Adversarial Robustness Guarantees for Random Deep Neural Networks | Code | 0 |
| Bayesian optimization of atomic structures with prior probabilities from universal interatomic potentials | Code | 0 |
| Hybrid Parameter Search and Dynamic Model Selection for Mixed-Variable Bayesian Optimization | Code | 0 |
| Hyperbolic Secant representation of the logistic function: Application to probabilistic Multiple Instance Learning for CT intracranial hemorrhage detection | Code | 0 |
| On the Estimation of Derivatives Using Plug-in Kernel Ridge Regression Estimators | Code | 0 |
| HyperBO+: Pre-training a universal prior for Bayesian optimization with hierarchical Gaussian processes | Code | 0 |
| Hyper-parameter tuning of physics-informed neural networks: Application to Helmholtz problems | Code | 0 |
| Reliable training and estimation of variance networks | Code | 0 |
| Non-Euclidean Universal Approximation | Code | 0 |
| Conditionally Independent Multiresolution Gaussian Processes | Code | 0 |
Page 71 of 79

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | ICKy, periodic | Root mean square error (RMSE) | 0.03 | | Unverified |