Adaptive sequential Monte Carlo by means of mixture of experts

From MaRDI portal
Publication: Q892475

DOI: 10.1007/s11222-012-9372-2
zbMATH Open: 1325.62151
arXiv: 1108.2836
OpenAlex: W1978093868
MaRDI QID: Q892475


Authors: Julien Cornebise, Eric Moulines, Jimmy Olsson


Publication date: 19 November 2015

Published in: Statistics and Computing

Abstract: Appropriately designing the proposal kernel of particle filters is an issue of significant importance, since a bad choice may lead to deterioration of the particle sample and, consequently, waste of computational power. In this paper we introduce a novel algorithm adaptively approximating the so-called optimal proposal kernel by a mixture of integrated curved exponential distributions with logistic weights. This family of distributions, referred to as mixtures of experts, is broad enough to be used in the presence of multi-modality or strongly skewed distributions. The mixtures are fitted, via online-EM methods, to the optimal kernel through minimisation of the Kullback-Leibler divergence between the auxiliary target and instrumental distributions of the particle filter. At each iteration of the particle filter, the algorithm is required to solve only a single optimisation problem for the whole particle sample, yielding an algorithm with only linear complexity. In addition, we illustrate in a simulation study how the method can be successfully applied to optimal filtering in nonlinear state-space models.
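The core idea — approximating the optimal proposal kernel by a mixture fitted to likelihood-weighted particles, then importance-reweighting against the fitted proposal — can be sketched as follows. This is a simplified illustration, not the paper's method: it uses a plain 1-D Gaussian mixture fitted by batch weighted EM (rather than integrated curved exponential experts with logistic weights fitted by online EM), on the standard nonlinear growth benchmark model.

```python
import numpy as np

def transition_mean(x, t):
    # Classic nonlinear benchmark dynamics; transition noise variance is 10.
    return 0.5 * x + 25.0 * x / (1.0 + x**2) + 8.0 * np.cos(1.2 * t)

def log_lik(y, x):
    # Observation model: y_t = x_t^2 / 20 + N(0, 1)
    return -0.5 * (y - x**2 / 20.0) ** 2

def weighted_gmm_em(x, w, K=2, n_iter=25, seed=0):
    """Fit a K-component 1-D Gaussian mixture to weighted samples by EM
    (a stand-in for the paper's mixture-of-experts / online-EM fit)."""
    rng = np.random.default_rng(seed)
    w = w / w.sum()
    mu = rng.choice(x, size=K, replace=False)
    var = np.full(K, np.var(x) + 1e-6)
    pi = np.full(K, 1.0 / K)
    for _ in range(n_iter):
        # E-step: responsibilities of each component for each sample
        d = x[:, None] - mu[None, :]
        logp = -0.5 * d**2 / var - 0.5 * np.log(2 * np.pi * var) + np.log(pi)
        logp -= logp.max(axis=1, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: updates weighted by both responsibilities and sample weights
        nk = (w[:, None] * r).sum(axis=0) + 1e-12
        mu = (w[:, None] * r * x[:, None]).sum(axis=0) / nk
        var = (w[:, None] * r * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-6
        pi = nk / nk.sum()
    return pi, mu, var

def gmm_logpdf(x, pi, mu, var):
    d = x[:, None] - mu[None, :]
    logp = -0.5 * d**2 / var - 0.5 * np.log(2 * np.pi * var) + np.log(pi)
    m = logp.max(axis=1)
    return m + np.log(np.exp(logp - m[:, None]).sum(axis=1))

def adaptive_pf(ys, n=500, K=2, seed=1):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n)
    w = np.full(n, 1.0 / n)
    means = []
    for t, y in enumerate(ys):
        anc = rng.choice(n, size=n, p=w)  # resample ancestors
        xa = x[anc]
        # Pilot sample from the dynamics, weighted by the new likelihood:
        # a cheap weighted approximation of the optimal proposal kernel.
        xp = transition_mean(xa, t) + np.sqrt(10.0) * rng.standard_normal(n)
        lw = log_lik(y, xp)
        pw = np.exp(lw - lw.max())
        pw /= pw.sum()
        pi, mu, var = weighted_gmm_em(xp, pw, K=K, seed=t)
        # Draw new particles from the fitted mixture proposal, then
        # importance-reweight: likelihood * transition / proposal.
        k = rng.choice(K, size=n, p=pi)
        xn = mu[k] + np.sqrt(var[k]) * rng.standard_normal(n)
        lt = -0.5 * (xn - transition_mean(xa, t)) ** 2 / 10.0
        lw = log_lik(y, xn) + lt - gmm_logpdf(xn, pi, mu, var)
        w = np.exp(lw - lw.max())
        w /= w.sum()
        x = xn
        means.append(float((w * x).sum()))
    return np.array(means)
```

The single mixture fit per time step is shared by the whole particle sample, which is what keeps the cost linear in the number of particles; the paper achieves the fit via online EM and a KL-divergence criterion rather than the batch EM used in this sketch.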


Full work available at URL: https://arxiv.org/abs/1108.2836




Recommendations




Cites Work


Cited In (4)





This page was built for publication: Adaptive sequential Monte Carlo by means of mixture of experts
