Can Sequential Bayesian Inference Solve Continual Learning?

2021-11-22 · Advances in Approximate Bayesian Inference (AABI) Symposium 2022

Samuel Kessler, Adam D. Cobb, Stefan Zohren, Stephen J. Roberts

Abstract

Previous work in Continual Learning (CL) has used sequential Bayesian inference to prevent forgetting and to accumulate knowledge from previous tasks. A limiting factor in performing Bayesian CL has been the intractability of exact inference in a Bayesian Neural Network (NN). We perform sequential Bayesian inference in a Bayesian NN using Hamiltonian Monte Carlo (HMC), propagating the posterior as the prior for a new task by fitting a density estimator on the HMC samples. We find that this approach fails to prevent forgetting. We propose an alternative view of the CL problem that directly models the data-generating process and decomposes the CL problem into task-specific and shared parameters. This method, named Prototypical Bayesian CL, performs well compared to state-of-the-art Bayesian CL methods.
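The pipeline the abstract describes — sample the posterior for task 1 with HMC, fit a density estimator to those samples, and reuse the fitted density as the prior for task 2 — can be illustrated on a toy one-parameter model. The sketch below is not the paper's implementation: it uses a minimal hand-rolled HMC sampler, a scalar Gaussian model in place of a Bayesian NN, and a single Gaussian as the density estimator; all function names are invented for illustration.

```python
import numpy as np

def hmc_sample(log_post, grad_log_post, w0, n_samples=500, step=0.04, n_leap=10, seed=0):
    """Minimal 1-D Hamiltonian Monte Carlo sampler (illustrative only)."""
    rng = np.random.default_rng(seed)
    w, samples = w0, []
    for _ in range(n_samples):
        p = rng.normal()                      # resample momentum
        w_new, p_new = w, p
        # leapfrog integration of Hamiltonian dynamics
        p_new += 0.5 * step * grad_log_post(w_new)
        for _ in range(n_leap - 1):
            w_new += step * p_new
            p_new += step * grad_log_post(w_new)
        w_new += step * p_new
        p_new += 0.5 * step * grad_log_post(w_new)
        # Metropolis accept/reject on the Hamiltonian
        h_old = -log_post(w) + 0.5 * p ** 2
        h_new = -log_post(w_new) + 0.5 * p_new ** 2
        if rng.random() < np.exp(min(0.0, h_old - h_new)):
            w = w_new
        samples.append(w)
    return np.array(samples)

def make_log_post(data, mu0, s0):
    """Gaussian likelihood (unit noise) with Gaussian prior N(mu0, s0^2)."""
    def log_post(w):
        return -0.5 * ((w - mu0) / s0) ** 2 - 0.5 * np.sum((data - w) ** 2)
    def grad(w):
        return -(w - mu0) / s0 ** 2 + np.sum(data - w)
    return log_post, grad

rng = np.random.default_rng(1)

# Task 1: infer a scalar mean under prior N(0, 1).
y1 = rng.normal(2.0, 1.0, size=50)
lp1, g1 = make_log_post(y1, 0.0, 1.0)
s1 = hmc_sample(lp1, g1, 0.0)[100:]           # drop burn-in

# "Density estimator" step: fit a Gaussian to the task-1 posterior samples.
mu1, sd1 = s1.mean(), s1.std()

# Task 2: the fitted density becomes the prior (sequential Bayesian update).
y2 = rng.normal(2.5, 1.0, size=50)
lp2, g2 = make_log_post(y2, mu1, sd1)
s2 = hmc_sample(lp2, g2, mu1)[100:]
```

In this conjugate toy setting the Gaussian density estimator is exact, so the task-2 posterior correctly blends both datasets; the paper's finding is that this propagation breaks down for multi-modal Bayesian NN posteriors, where no tractable density estimator captures the HMC samples faithfully.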
