How to avoid the curse of dimensionality: scalability of particle filters with and without importance weights
From MaRDI portal
Publication: 4621283
DOI: 10.1137/17M1125340 · zbMATH Open: 1415.93268 · arXiv: 1703.07879 · Wikidata: Q128493460 · Scholia: Q128493460 · MaRDI QID: Q4621283 · FDO: Q4621283
Authors: Simone Carlo Surace, Anna Kutschireiter, Jean-Pascal Pfister
Publication date: 11 February 2019
Published in: SIAM Review
Abstract: Particle filters are a popular and flexible class of numerical algorithms for solving a wide range of nonlinear filtering problems. However, standard particle filters with importance weights have been shown to require a sample size that increases exponentially with the dimension D of the state space in order to achieve a certain performance, which precludes their use in very high-dimensional filtering problems. Here, we focus on the dynamic aspect of this curse of dimensionality (COD) in continuous-time filtering, which is caused by the degeneracy of importance weights over time. We show that the degeneracy occurs on a timescale that decreases with increasing D. In order to soften the effects of weight degeneracy, most particle filters use particle resampling and improved proposal functions for the particle motion. We explain why neither of the two can prevent the COD in general. In order to address this fundamental problem, we investigate an existing filtering algorithm based on optimal feedback control that sidesteps the use of importance weights. We use numerical experiments to show that this Feedback Particle Filter (FPF) by Yang et al. (2013) does not exhibit a COD.
Full work available at URL: https://arxiv.org/abs/1703.07879
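The weight degeneracy described in the abstract can be seen in a minimal numerical illustration (this sketch is not the paper's experiment; the linear-Gaussian setup, sample size N, and seed are illustrative assumptions). It draws N particles from a D-dimensional standard Gaussian prior, weights them by a unit-noise Gaussian likelihood, and reports the effective sample size (ESS), which collapses as D grows:

```python
import math
import random

def effective_sample_size(log_weights):
    """ESS = 1 / sum(w_i^2) for normalized weights; computed in log space for stability."""
    m = max(log_weights)
    w = [math.exp(lw - m) for lw in log_weights]
    s = sum(w)
    w = [wi / s for wi in w]
    return 1.0 / sum(wi * wi for wi in w)

def ess_for_dimension(D, N=1000, seed=0):
    """Illustrative importance-sampling step: prior N(0, I_D), observation noise N(0, I_D)."""
    rng = random.Random(seed)
    x_true = [rng.gauss(0, 1) for _ in range(D)]
    y = [xt + rng.gauss(0, 1) for xt in x_true]   # noisy observation of the state
    log_weights = []
    for _ in range(N):
        x = [rng.gauss(0, 1) for _ in range(D)]   # particle from the prior
        # log-likelihood of y given particle x (up to an additive constant)
        log_weights.append(-0.5 * sum((yi - xi) ** 2 for yi, xi in zip(y, x)))
    return effective_sample_size(log_weights)

if __name__ == "__main__":
    # ESS shrinks rapidly with the state dimension D: the curse of dimensionality
    for D in (1, 10, 50, 100):
        print(f"D={D:4d}  ESS={ess_for_dimension(D):.1f}")
```

With N fixed, the ESS drops toward 1 as D increases, which is why weighted particle filters need exponentially many particles; the FPF studied in the paper avoids this by replacing weighting with a feedback-control update of the particle positions.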
Recommendations
- Curse-of-dimensionality revisited: Collapse of the particle filter in very large scale systems
- Can local particle filters beat the curse of dimensionality?
- A stable particle filter for a class of high-dimensional state-space models
- Multivariable feedback particle filter
- A unification of weighted and unweighted particle filters
Cites Work
- Central limit theorem for sequential Monte Carlo methods and its application to Bayesian inference
- Curse-of-dimensionality revisited: Collapse of the particle filter in very large scale systems
- Particle filters
- A survey of convergence results on particle filtering methods for practitioners
- Fundamentals of stochastic filtering
- Multivariable feedback particle filter
- Feedback Particle Filter
- Particle approximations for a class of stochastic partial differential equations
- Approximate McKean-Vlasov representations for a class of SPDEs
Cited In (18)
- Multivariate feedback particle filter rederived from the splitting-up scheme
- Controlled interacting particle algorithms for simulation-based reinforcement learning
- A unification of weighted and unweighted particle filters
- Projected data assimilation using sliding window proper orthogonal decomposition
- McKean-Vlasov SDEs in Nonlinear Filtering
- Convergence of Regularized Particle Filters for Stochastic Reaction Networks
- Model and data reduction for data assimilation: particle filters employing projected forecasts and data with application to a shallow water model
- Curse-of-dimensionality revisited: Collapse of the particle filter in very large scale systems
- Bellman filtering and smoothing for state-space models
- Autoregressive Point Processes as Latent State-Space Models: A Moment-Closure Approach to Fluctuations and Autocorrelations
- The Hitchhiker's guide to nonlinear filtering
- Can local particle filters beat the curse of dimensionality?
- A stable particle filter for a class of high-dimensional state-space models
- Diffusion map-based algorithm for gain function approximation in the feedback particle filter
- High Order Deep Neural Network for Solving High Frequency Partial Differential Equations
- Clustered exact Daum-Huang particle flow filter
- Feedback particle filter for collective inference
- Multilevel ensemble Kalman-Bucy filters