SOTAVerified

Gaussian Processes

Gaussian Processes are a powerful framework for several machine learning tasks such as regression, classification, and inference. Given a finite set of input-output training data generated by a fixed (but possibly unknown) function, the framework models the unknown function as a stochastic process such that the training outputs are a finite collection of jointly Gaussian random variables, whose properties can then be used to infer the statistics (the mean and variance) of the function at test inputs.

Source: Sequential Randomized Matrix Factorization for Gaussian Processes: Efficient Predictions and Hyper-parameter Optimization
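The conditioning step described above can be sketched in a few lines of numpy. This is a minimal illustration, not code from any listed paper: it assumes a squared-exponential (RBF) kernel with hypothetical length-scale and noise parameters, builds the joint Gaussian over training and test outputs, and conditions on the observed outputs to obtain the predictive mean and variance.

```python
import numpy as np

def rbf_kernel(a, b, length_scale=1.0, variance=1.0):
    # Squared-exponential covariance between two sets of 1-D inputs.
    sq_dists = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * sq_dists / length_scale**2)

def gp_posterior(x_train, y_train, x_test, noise=1e-2):
    # Training outputs and test outputs are jointly Gaussian; conditioning
    # on the observed training outputs yields the predictive distribution.
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_test)
    K_ss = rbf_kernel(x_test, x_test)
    L = np.linalg.cholesky(K)                      # K = L L^T
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha                           # predictive mean
    v = np.linalg.solve(L, K_s)
    var = np.diag(K_ss - v.T @ v)                  # predictive variance
    return mean, var

# Toy usage: fit noisy samples of sin(x), predict at two new inputs.
x = np.linspace(0, 2 * np.pi, 8)
y = np.sin(x)
mu, var = gp_posterior(x, y, np.array([1.0, 4.0]))
```

The Cholesky factorization is the standard way to solve the kernel system stably; the sequential randomized factorizations discussed in the source paper aim to make exactly this step cheaper for large training sets.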

Papers

Showing 1776–1800 of 1963 papers

Title | Status | Hype
Identifying Sources and Sinks in the Presence of Multiple Agents with Gaussian Process Vector Calculus | Code | 0
Identifying stochastic oscillations in single-cell live imaging time series using Gaussian processes | Code | 0
Deep Random Splines for Point Process Intensity Estimation of Neural Population Data | Code | 0
Implementation and Analysis of GPU Algorithms for Vecchia Approximation | Code | 0
Deep Kernels with Probabilistic Embeddings for Small-Data Learning | Code | 0
Implicit Posterior Variational Inference for Deep Gaussian Processes | Code | 0
Active Learning with Weak Supervision for Gaussian Processes | Code | 0
Unifying Probabilistic Models for Time-Frequency Analysis | Code | 0
Function-space Inference with Sparse Implicit Processes | Code | 0
Improved uncertainty quantification for neural networks with Bayesian last layer | Code | 0
Sparse Inducing Points in Deep Gaussian Processes: Enhancing Modeling with Denoising Diffusion Variational Inference | Code | 0
Deep neural networks with dependent weights: Gaussian Process mixture limit, heavy tails, sparsity and compressibility | Code | 0
Improving Linear System Solvers for Hyperparameter Optimisation in Iterative Gaussian Processes | Code | 0
Nonlinear Inverse Reinforcement Learning with Gaussian Processes | Code | 0
Deep Neural Networks as Gaussian Processes | Code | 0
Residual Deep Gaussian Processes on Manifolds | Code | 0
Incorporating Prior Knowledge into Neural Networks through an Implicit Composite Kernel | Code | 0
A conditional one-output likelihood formulation for multitask Gaussian processes | Code | 0
Incorporating Sum Constraints into Multitask Gaussian Processes | Code | 0
Nonmyopic Global Optimisation via Approximate Dynamic Programming | Code | 0
Conditional Deep Gaussian Processes: empirical Bayes hyperdata learning | Code | 0
Sparse Multi-Output Gaussian Processes for Medical Time Series Prediction | Code | 0
Non-parametric Estimation of Stochastic Differential Equations with Sparse Gaussian Processes | Code | 0
Indian Buffet process for model selection in convolved multiple-output Gaussian processes | Code | 0
A Statistical Learning View of Simple Kriging | Code | 0
Page 72 of 79

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | ICKy, periodic | Root mean square error (RMSE) | 0.03 | | Unverified