Biased online parameter inference for state-space models
From MaRDI portal
Abstract: We consider Bayesian online static parameter estimation for state-space models. This is an important but computationally challenging problem, as exact state-of-the-art methods often have a computational cost that grows with the time parameter; perhaps the most successful algorithm is SMC2 [9]. We present a version of the SMC2 algorithm whose computational cost does not grow with the time parameter. In addition, under assumptions, the algorithm is shown to provide consistent estimates of expectations with respect to the posterior. However, the cost of achieving this consistency can be exponential in the dimension of the parameter space; if this exponential cost is avoided, the algorithm is typically biased. The bias is investigated from a theoretical perspective and, under assumptions, we find that it does not accumulate as the time parameter grows. The algorithm is implemented on several Bayesian statistical models.
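The abstract's setting can be illustrated with a minimal sketch of a nested/online SMC scheme with fixed per-step cost: an outer population of parameter particles, each carrying an inner particle filter for the hidden state, with the parameters jittered after resampling to maintain diversity (this jittering kernel is one source of the bias discussed above). This is not the paper's algorithm, only a hedged toy version on a hypothetical linear-Gaussian model with made-up settings (theta_true, particle counts, jitter scale).

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy linear-Gaussian state-space model (illustrative, not from the paper):
#   x_t = theta * x_{t-1} + v_t,  v_t ~ N(0, 1)
#   y_t = x_t + w_t,              w_t ~ N(0, 0.5^2)
theta_true, T = 0.8, 200
x, ys = 0.0, []
for _ in range(T):
    x = theta_true * x + rng.normal()
    ys.append(x + 0.5 * rng.normal())

M, N, jitter = 200, 50, 0.05          # outer/inner particle counts, jitter scale
thetas = rng.uniform(-1.0, 1.0, M)    # outer particles: candidate parameter values
states = np.zeros((M, N))             # inner particles: N state samples per theta

def gauss_lik(y, mean, sd):
    """Gaussian observation density N(y; mean, sd^2), vectorised."""
    return np.exp(-0.5 * ((y - mean) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

for y in ys:                          # one pass per observation: fixed cost per step
    # Propagate each inner particle filter under its own theta
    states = thetas[:, None] * states + rng.normal(size=(M, N))
    lik = gauss_lik(y, states, 0.5) + 1e-300   # guard against exact underflow
    outer_w = lik.mean(axis=1)                 # marginal-likelihood estimate per theta
    # Resample inner particles within each outer particle
    for i in range(M):
        w = lik[i] / lik[i].sum()
        states[i] = states[i, rng.choice(N, N, p=w)]
    # Resample outer particles, then jitter theta (keeps diversity, introduces bias)
    outer_w /= outer_w.sum()
    idx = rng.choice(M, M, p=outer_w)
    thetas = thetas[idx] + jitter * rng.normal(size=M)
    states = states[idx]

est = thetas.mean()
print(f"posterior-mean estimate of theta: {est:.3f} (true {theta_true})")
```

Because both population sizes M and N are fixed, the work per observation is constant in t; the price is that the jitter kernel perturbs the target, which is the kind of bias the paper analyses.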
Recommendations
- Sequential Bayesian inference for static parameters in dynamic state space models
- \(\mathrm{SMC}^2\): an efficient algorithm for sequential analysis of state space models
- Nested particle filters for online parameter estimation in discrete-time state-space Markov models
- Bias of particle approximations to optimal filter derivative
- Uniform convergence over time of a nested particle filtering scheme for recursive parameter estimation in state-space Markov models
Cites work
- scientific article; zbMATH DE number 5919872 (no title available)
- scientific article; zbMATH DE number 2106098 (no title available)
- A Monte Carlo Approach to Filtering for a Class of Marked Doubly Stochastic Poisson Processes
- A nonasymptotic theorem for unnormalized Feynman-Kac particle models
- Consistency of the maximum likelihood estimator for general hidden Markov models
- Following a moving target -- Monte Carlo inference for dynamic Bayesian models
- Inference in hidden Markov models.
- Mean field simulation for Monte Carlo integration
- Nested particle filters for online parameter estimation in discrete-time state-space Markov models
- On particle methods for parameter estimation in state-space models
- On the convergence of adaptive sequential Monte Carlo methods
- Particle Markov Chain Monte Carlo Methods
- Particle-kernel estimation of the filter density in state-space models
- Path storage in the particle filter
- Practical Filtering with Sequential Parameter Learning
- Sequential Monte Carlo Samplers
- The Bernstein-Von Mises Theorem for Markov Processes
- The correlated pseudomarginal method
- Theory of segmented particle filters
- \(\mathrm{SMC}^2\): an efficient algorithm for sequential analysis of state space models
Cited in (6)
- Stochastic gradient MCMC for state space models
- Sequential Bayesian inference for implicit hidden Markov models and current limitations
- Uniform convergence over time of a nested particle filtering scheme for recursive parameter estimation in state-space Markov models
- Inference via low-dimensional couplings
- Bayesian Dynamic Feature Partitioning in High-Dimensional Regression With Big Data
- Sequential Bayesian inference for static parameters in dynamic state space models