Enhanced SMC^2: Leveraging Gradient Information from Differentiable Particle Filters Within Langevin Proposals
Conor Rosato, Joshua Murphy, Alessandro Varsi, Paul Horridge, Simon Maskell
Code: github.com/j-j-murphy/smc-squared-langevin (official PyTorch implementation)
Abstract
Sequential Monte Carlo Squared (SMC^2) is a Bayesian method that can infer the states and parameters of non-linear, non-Gaussian state-space models. The standard random-walk proposal used in SMC^2 faces challenges, particularly in high-dimensional parameter spaces. This study outlines a novel approach that harnesses first-order gradients derived from a Common Random Numbers Particle Filter (CRN-PF) implemented in PyTorch. The resulting gradients can be leveraged within a Langevin proposal without an accept/reject step. Including Langevin dynamics in the proposal can yield a higher effective sample size and more accurate parameter estimates than the random walk. The resulting algorithm is parallelized on distributed memory using the Message Passing Interface (MPI) and runs in O(log₂ N) time complexity. Using 64 computational cores, we obtain a 51x speed-up compared to a single core. A GitHub link providing access to the code is given.
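The Langevin proposal described above can be sketched in PyTorch as follows. This is a minimal illustration, not the paper's implementation: `log_likelihood_fn` stands in for the differentiable CRN particle-filter log-likelihood estimate, and the step size and toy target are hypothetical.

```python
import torch

def langevin_proposal(theta, log_likelihood_fn, step_size=0.1):
    """One unadjusted Langevin step (no accept/reject), as in the abstract.

    `log_likelihood_fn` is a stand-in for the differentiable CRN-PF
    log-likelihood estimate; its exact form here is an assumption.
    """
    theta = theta.detach().requires_grad_(True)
    # First-order gradient of the log-likelihood via PyTorch autograd
    ll = log_likelihood_fn(theta)
    grad, = torch.autograd.grad(ll, theta)
    # Langevin dynamics: gradient drift plus Gaussian diffusion noise
    noise = torch.randn_like(theta)
    return (theta + 0.5 * step_size**2 * grad + step_size * noise).detach()

# Toy usage: a quadratic log-likelihood (standard Gaussian), purely illustrative
torch.manual_seed(0)
theta0 = torch.zeros(3)
theta1 = langevin_proposal(theta0, lambda t: -0.5 * (t**2).sum())
```

In the full SMC^2 setting, each parameter particle would be moved by such a step, with the gradient supplied by differentiating the particle filter's likelihood estimate under common random numbers.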