
Sensitivity analysis in HMMs with application to likelihood maximization

2009-12-01 · NeurIPS 2009

Pierre-Arnaud Coquelin, Romain Deguest, Rémi Munos


Abstract

This paper considers a sensitivity analysis in Hidden Markov Models with continuous state and observation spaces. We propose an Infinitesimal Perturbation Analysis (IPA) of the filtering distribution with respect to some parameters of the model. We describe a methodology for using any algorithm that estimates the filtering density, such as Sequential Monte Carlo methods, to design an algorithm that estimates its gradient. The resulting IPA estimator is proven to be asymptotically unbiased and consistent, and has computational complexity linear in the number of particles. We consider an application of this analysis to the problem of identifying unknown parameters of the model given a sequence of observations. We derive an IPA estimator for the gradient of the log-likelihood, which may be used in a gradient method for the purpose of likelihood maximization. We illustrate the method with several numerical experiments.
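The idea described in the abstract can be illustrated with a minimal sketch (not the paper's actual algorithm): a bootstrap particle filter on a toy linear-Gaussian model that propagates, alongside each particle, its pathwise derivative (tangent) with respect to a parameter, copying tangents with their ancestors at resampling. The model, function name, and all numerical choices below are illustrative assumptions.

```python
import numpy as np

def ipa_particle_filter(y, theta, n_particles=1000, seed=0):
    """IPA-style particle filter sketch for the toy model
    x_t = theta * x_{t-1} + v_t,  y_t = x_t + e_t,  v_t, e_t ~ N(0, 1).
    Each particle carries a tangent dx/dtheta; tangents follow their
    ancestors through resampling. Returns an estimate of the
    log-likelihood (up to additive constants) and its gradient in theta."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n_particles)   # particles drawn from prior N(0, 1)
    dx = np.zeros(n_particles)             # tangents d x / d theta
    loglik, dloglik = 0.0, 0.0
    for yt in y:
        # Propagation with common random numbers -> pathwise (IPA) derivative:
        # x_new = theta * x_old + v, so dx_new = x_old + theta * dx_old.
        dx = x + theta * dx
        x = theta * x + rng.standard_normal(n_particles)
        # Weighting by g(y|x) = N(y; x, 1); constants dropped from logw.
        logw = -0.5 * (yt - x) ** 2
        dlogw = (yt - x) * dx              # chain rule: d logw/dx * dx/dtheta
        w = np.exp(logw - logw.max())
        W = w / w.sum()
        # Log-likelihood increment log p(y_t | y_{1:t-1}) and its gradient:
        # d/dtheta log(mean exp(logw_i)) = sum_i W_i * dlogw_i.
        loglik += np.log(w.mean()) + logw.max()
        dloglik += np.sum(W * dlogw)
        # Multinomial resampling; tangents are copied with their ancestors.
        idx = rng.choice(n_particles, size=n_particles, p=W)
        x, dx = x[idx], dx[idx]
    return loglik, dloglik
```

The gradient estimate can then drive a (stochastic) gradient-ascent step on theta for likelihood maximization, e.g. `theta += step * dloglik`; keeping the random seed fixed across evaluations gives the common-random-numbers behaviour that pathwise derivatives rely on.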
