
Derivation of Output Correlation Inferences for Multi-Output (aka Multi-Task) Gaussian Process

2025-01-14

Shuhei Watanabe


Abstract

The Gaussian process (GP) is arguably one of the most widely used machine learning models in practice. One of its prominent applications is Bayesian optimization (BO). Although the vanilla GP is already a powerful tool for BO, it is often beneficial to model the dependencies among multiple outputs. The multi-task GP (MTGP) was formulated for this purpose, but it is not trivial to fully understand the derivations of its formulations and their gradients from the previous literature. This paper provides friendly derivations of the MTGP formulations and their gradients.
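To make the idea of output dependencies concrete, a common MTGP formulation is the intrinsic coregionalization model (ICM), where the joint covariance over all (task, input) pairs factorizes as a Kronecker product of a task covariance and an input kernel. The sketch below is illustrative only, assuming an RBF input kernel and a hand-picked task covariance; the function names and hyperparameters are not from the paper.

```python
import numpy as np

# Minimal ICM sketch: Cov[f_t(x), f_s(x')] = B[t, s] * k(x, x'),
# so the full covariance over stacked outputs is B (x) K_x (Kronecker product).


def rbf_kernel(X1, X2, length_scale=1.0):
    """Squared-exponential kernel between two sets of 1-D inputs."""
    d2 = (X1[:, None] - X2[None, :]) ** 2
    return np.exp(-0.5 * d2 / length_scale ** 2)


def icm_covariance(X, B, length_scale=1.0):
    """Joint covariance over all (task, input) pairs: kron(B, K_x)."""
    Kx = rbf_kernel(X, X, length_scale)
    return np.kron(B, Kx)


X = np.linspace(0.0, 1.0, 5)

# Task covariance B = L @ L.T (positive semi-definite by construction);
# the off-diagonal entries encode correlation between the two tasks.
L = np.array([[1.0, 0.0], [0.9, 0.3]])
B = L @ L.T

K = icm_covariance(X, B)
print(K.shape)  # (n_tasks * n_points, n_tasks * n_points) = (10, 10)
```

Because the joint covariance is a valid kernel matrix, one can sample correlated task outputs from `N(0, K)` or condition on observations from one task to inform predictions for another, which is the core benefit the abstract alludes to.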
