How to avoid the curse of dimensionality: scalability of particle filters with and without importance weights
From MaRDI portal
Publication:4621283
Abstract: Particle filters are a popular and flexible class of numerical algorithms for solving a wide range of nonlinear filtering problems. However, standard particle filters with importance weights have been shown to require a sample size that increases exponentially with the dimension D of the state space in order to achieve a given performance, which precludes their use in very high-dimensional filtering problems. Here, we focus on the dynamic aspect of this curse of dimensionality (COD) in continuous-time filtering, which is caused by the degeneracy of importance weights over time. We show that this degeneracy occurs on a time scale that decreases with increasing D. To soften the effects of weight degeneracy, most particle filters use particle resampling and improved proposal functions for the particle motion. We explain why neither of the two can prevent the COD in general. To address this fundamental problem, we investigate an existing filtering algorithm based on optimal feedback control that sidesteps the use of importance weights. We use numerical experiments to show that this Feedback Particle Filter (FPF) by Yang et al. (2013) does not exhibit a COD.
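The weight degeneracy described in the abstract can be illustrated with a minimal sketch (not the paper's experiment): for a standard Gaussian prior and a single Gaussian observation, the effective sample size (ESS) of the bootstrap importance weights collapses as the state dimension D grows, even with the number of particles held fixed.

```python
import numpy as np

def effective_sample_size(log_w):
    """ESS = 1 / sum(normalized_w^2), computed stably in log space."""
    log_w = log_w - np.max(log_w)
    w = np.exp(log_w)
    w /= w.sum()
    return 1.0 / np.sum(w ** 2)

rng = np.random.default_rng(0)
N = 1000  # number of particles, held fixed across dimensions

for D in (1, 5, 20, 50):
    x = rng.standard_normal((N, D))  # particles drawn from the prior
    y = rng.standard_normal(D)       # a synthetic observation
    # Bootstrap importance weights: log-likelihood of y given each particle
    # (unit observation noise assumed for this illustration).
    log_w = -0.5 * np.sum((y - x) ** 2, axis=1)
    print(f"D={D:3d}  ESS={effective_sample_size(log_w):8.1f}")
```

The printed ESS shrinks rapidly with D, which is the static analogue of the time-scale collapse analyzed in the paper; the FPF avoids this by moving particles via feedback control instead of reweighting them.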
Recommendations
- Curse-of-dimensionality revisited: Collapse of the particle filter in very large scale systems
- Can local particle filters beat the curse of dimensionality?
- A stable particle filter for a class of high-dimensional state-space models
- Multivariable feedback particle filter
- A unification of weighted and unweighted particle filters
Cites work
- scientific article; zbMATH DE number 5919872
- scientific article; zbMATH DE number 5871504
- A survey of convergence results on particle filtering methods for practitioners
- Approximate McKean-Vlasov representations for a class of SPDEs
- Central limit theorem for sequential Monte Carlo methods and its application to Bayesian inference
- Curse-of-dimensionality revisited: Collapse of the particle filter in very large scale systems
- Feedback Particle Filter
- Fundamentals of stochastic filtering
- Multivariable feedback particle filter
- Particle approximations for a class of stochastic partial differential equations
- Particle filters
Cited in (18)
- Projected data assimilation using sliding window proper orthogonal decomposition
- Curse-of-dimensionality revisited: Collapse of the particle filter in very large scale systems
- Autoregressive Point Processes as Latent State-Space Models: A Moment-Closure Approach to Fluctuations and Autocorrelations
- High Order Deep Neural Network for Solving High Frequency Partial Differential Equations
- Multilevel ensemble Kalman-Bucy filters
- The Hitchhiker's guide to nonlinear filtering
- Can local particle filters beat the curse of dimensionality?
- Controlled interacting particle algorithms for simulation-based reinforcement learning
- Clustered exact Daum-Huang particle flow filter
- Diffusion map-based algorithm for gain function approximation in the feedback particle filter
- A unification of weighted and unweighted particle filters
- Feedback particle filter for collective inference
- Convergence of Regularized Particle Filters for Stochastic Reaction Networks
- A stable particle filter for a class of high-dimensional state-space models
- Multivariate feedback particle filter rederived from the splitting-up scheme
- Model and data reduction for data assimilation: particle filters employing projected forecasts and data with application to a shallow water model
- McKean-Vlasov SDEs in Nonlinear Filtering
- Bellman filtering and smoothing for state-space models