SOTA Verified

Gaussian Processes

Gaussian processes are a powerful framework for several machine learning tasks such as regression, classification, and inference. Given a finite set of input/output training data generated by a fixed (but possibly unknown) function, the framework models the unknown function as a stochastic process such that the training outputs are a finite number of jointly Gaussian random variables, whose properties can then be used to infer the statistics (the mean and variance) of the function at test inputs.

Source: Sequential Randomized Matrix Factorization for Gaussian Processes: Efficient Predictions and Hyper-parameter Optimization
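The inference step described above (using the joint Gaussianity of training outputs to get a predictive mean and variance at test inputs) can be sketched in a few lines of NumPy. This is a minimal illustration under common simplifying assumptions (zero prior mean, an RBF kernel, fixed hyper-parameters; the helper names are made up here), not the method of the cited paper:

```python
import numpy as np

def rbf_kernel(a, b, length_scale=1.0, variance=1.0):
    """Squared-exponential (RBF) covariance between two sets of 1-D inputs."""
    sq_dists = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * sq_dists / length_scale**2)

def gp_posterior(x_train, y_train, x_test, noise=1e-2):
    """Posterior mean and variance of a zero-mean GP at the test inputs."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_test)
    K_ss = rbf_kernel(x_test, x_test)
    # Cholesky factorization of the training covariance for numerical stability.
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha                       # predictive mean
    v = np.linalg.solve(L, K_s)
    var = np.diag(K_ss) - np.sum(v**2, axis=0)  # predictive variance
    return mean, var

x_train = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
y_train = np.sin(x_train)
x_test = np.array([0.5, 3.0])
mean, var = gp_posterior(x_train, y_train, x_test)
# Near the data (x = 0.5) the posterior variance is small; far from the
# data (x = 3.0) it reverts toward the prior variance.
```

Note that the exact posterior requires solving a linear system with the full training covariance (cubic cost in the number of training points), which is precisely what approximation schemes such as the one in the paper above aim to avoid.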

Papers

Showing 1751-1800 of 1963 papers

Title | Status | Hype
Task Diversity in Bayesian Federated Learning: Simultaneous Processing of Classification and Regression | Code | 0
Hierarchical-Hyperplane Kernels for Actively Learning Gaussian Process Models of Nonstationary Systems | Code | 0
Hierarchical Inducing Point Gaussian Process for Inter-domain Observations | Code | 0
Neural Operator Variational Inference based on Regularized Stein Discrepancy for Deep Gaussian Processes | Code | 0
Neural signature kernels as infinite-width-depth-limits of controlled ResNets | Code | 0
Deterministic error bounds for kernel-based learning techniques under bounded noise | Code | 0
Detecting Misclassification Errors in Neural Networks with a Gaussian Process Model | Code | 0
Variational Bayesian Multiple Instance Learning With Gaussian Processes | Code | 0
Deep Variational Implicit Processes | Code | 0
Deep Structured Mixtures of Gaussian Processes | Code | 0
Considering discrepancy when calibrating a mechanistic electrophysiology model | Code | 0
Hodge-Compositional Edge Gaussian Processes | Code | 0
How Bayesian Should Bayesian Optimisation Be? | Code | 0
Adjusting Model Size in Continual Gaussian Processes: How Big is Big Enough? | Code | 0
Taylorformer: Probabilistic Modelling for Random Processes including Time Series | Code | 0
Adversarial Robustness Guarantees for Random Deep Neural Networks | Code | 0
Bayesian optimization of atomic structures with prior probabilities from universal interatomic potentials | Code | 0
Hybrid Parameter Search and Dynamic Model Selection for Mixed-Variable Bayesian Optimization | Code | 0
Hyperbolic Secant representation of the logistic function: Application to probabilistic Multiple Instance Learning for CT intracranial hemorrhage detection | Code | 0
On the Estimation of Derivatives Using Plug-in Kernel Ridge Regression Estimators | Code | 0
HyperBO+: Pre-training a universal prior for Bayesian optimization with hierarchical Gaussian processes | Code | 0
Hyper-parameter tuning of physics-informed neural networks: Application to Helmholtz problems | Code | 0
Reliable training and estimation of variance networks | Code | 0
Non-Euclidean Universal Approximation | Code | 0
Conditionally Independent Multiresolution Gaussian Processes | Code | 0
Identifying Sources and Sinks in the Presence of Multiple Agents with Gaussian Process Vector Calculus | Code | 0
Identifying stochastic oscillations in single-cell live imaging time series using Gaussian processes | Code | 0
Deep Random Splines for Point Process Intensity Estimation of Neural Population Data | Code | 0
Implementation and Analysis of GPU Algorithms for Vecchia Approximation | Code | 0
Deep Kernels with Probabilistic Embeddings for Small-Data Learning | Code | 0
Implicit Posterior Variational Inference for Deep Gaussian Processes | Code | 0
Active Learning with Weak Supervision for Gaussian Processes | Code | 0
Unifying Probabilistic Models for Time-Frequency Analysis | Code | 0
Function-space Inference with Sparse Implicit Processes | Code | 0
Improved uncertainty quantification for neural networks with Bayesian last layer | Code | 0
Sparse Inducing Points in Deep Gaussian Processes: Enhancing Modeling with Denoising Diffusion Variational Inference | Code | 0
Deep neural networks with dependent weights: Gaussian Process mixture limit, heavy tails, sparsity and compressibility | Code | 0
Improving Linear System Solvers for Hyperparameter Optimisation in Iterative Gaussian Processes | Code | 0
Nonlinear Inverse Reinforcement Learning with Gaussian Processes | Code | 0
Deep Neural Networks as Gaussian Processes | Code | 0
Residual Deep Gaussian Processes on Manifolds | Code | 0
Incorporating Prior Knowledge into Neural Networks through an Implicit Composite Kernel | Code | 0
A conditional one-output likelihood formulation for multitask Gaussian processes | Code | 0
Incorporating Sum Constraints into Multitask Gaussian Processes | Code | 0
Nonmyopic Global Optimisation via Approximate Dynamic Programming | Code | 0
Conditional Deep Gaussian Processes: empirical Bayes hyperdata learning | Code | 0
Sparse Multi-Output Gaussian Processes for Medical Time Series Prediction | Code | 0
Non-parametric Estimation of Stochastic Differential Equations with Sparse Gaussian Processes | Code | 0
Indian Buffet process for model selection in convolved multiple-output Gaussian processes | Code | 0
A Statistical Learning View of Simple Kriging | Code | 0
Page 36 of 40

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | ICKy, periodic | Root mean square error (RMSE) | 0.03 | | Unverified