SOTAVerified

Gaussian Processes

Gaussian processes are a powerful framework for machine learning tasks such as regression, classification, and inference. Given a finite set of input-output training data generated by a fixed (but possibly unknown) function, the framework models the unknown function as a stochastic process such that the training outputs are a finite number of jointly Gaussian random variables, whose properties can then be used to infer the statistics (the mean and variance) of the function at test inputs.

Source: Sequential Randomized Matrix Factorization for Gaussian Processes: Efficient Predictions and Hyper-parameter Optimization
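The inference step described above can be sketched with a minimal zero-mean GP regressor using a squared-exponential kernel. This is an illustrative implementation, not code from the cited paper; the kernel hyper-parameters, noise level, and function names below are assumptions chosen for the example.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential (RBF) covariance between input sets A and B."""
    sq_dists = (np.sum(A**2, axis=1)[:, None]
                + np.sum(B**2, axis=1)[None, :]
                - 2.0 * A @ B.T)
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)

def gp_posterior(X_train, y_train, X_test, noise=1e-2):
    """Posterior mean and variance of a zero-mean GP at test inputs."""
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    K_s = rbf_kernel(X_train, X_test)
    K_ss = rbf_kernel(X_test, X_test)
    # Cholesky factorization of the training covariance for numerical stability
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha                      # posterior mean at test inputs
    v = np.linalg.solve(L, K_s)
    var = np.diag(K_ss - v.T @ v)             # posterior marginal variance
    return mean, var

# Usage: condition on noisy samples of sin(x), then predict at a new input
X = np.linspace(0, 5, 20)[:, None]
y = np.sin(X).ravel()
mu, var = gp_posterior(X, y, np.array([[2.5]]))
```

The joint-Gaussian assumption is what makes this closed form possible: conditioning the test outputs on the observed training outputs is just the standard Gaussian conditioning formula, which the Cholesky solves implement.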

Papers

Showing 1801–1825 of 1963 papers

Title | Status | Hype
Training-Free Neural Active Learning with Initialization-Robustness Guarantees | Code | 0
Revisiting Active Sets for Gaussian Process Decoders | Code | 0
Revisiting the Sample Complexity of Sparse Spectrum Approximation of Gaussian Processes | Code | 0
Tensor Network-Constrained Kernel Machines as Gaussian Processes | Code | 0
Robust and Conjugate Gaussian Process Regression | Code | 0
Inference in Deep Gaussian Processes using Stochastic Gradient Hamiltonian Monte Carlo | Code | 0
Parameter Inference based on Gaussian Processes Informed by Nonlinear Partial Differential Equations | Code | 0
Robust and Conjugate Spatio-Temporal Gaussian Processes | Code | 0
No-Regret Learning in Unknown Games with Correlated Payoffs | Code | 0
Inferring Manifolds From Noisy Data Using Gaussian Processes | Code | 0
Robust and Scalable Gaussian Process Regression and Its Applications | Code | 0
Inferring Smooth Control: Monte Carlo Posterior Policy Iteration with Gaussian Processes | Code | 0
Inferring the Morphology of the Galactic Center Excess with Gaussian Processes | Code | 0
Robust Bayesian Optimization via Localized Online Conformal Prediction | Code | 0
Adversarial Robustness Guarantees for Gaussian Processes | Code | 0
Warped Input Gaussian Processes for Time Series Forecasting | Code | 0
Infinite-Horizon Gaussian Processes | Code | 0
Adversarial Attacks on Gaussian Process Bandits | Code | 0
Spatial Bayesian Neural Networks | Code | 0
Learning GPLVM with arbitrary kernels using the unscented transformation | Code | 0
How Infinitely Wide Neural Networks Can Benefit from Multi-task Learning -- an Exact Macroscopic Characterization | Code | 0
Infinite Width Graph Neural Networks for Node Regression/ Classification | Code | 0
Compositional uncertainty in deep Gaussian processes | Code | 0
Numerical Gaussian Processes for Time-dependent and Non-linear Partial Differential Equations | Code | 0
Active Learning with Gaussian Processes for High Throughput Phenotyping | Code | 0
Page 73 of 79

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | ICKy, periodic | Root mean square error (RMSE) | 0.03 | | Unverified