Sequential Bayesian inference for implicit hidden Markov models and current limitations
DOI: 10.1051/proc/201551002
zbMath: 1348.60106
arXiv: 1505.04321
OpenAlex: W2964170009
MaRDI QID: Q2786524
Publication date: 15 February 2016
Published in: ESAIM: Proceedings and Surveys
Full work available at URL: https://arxiv.org/abs/1505.04321
Mathematics Subject Classification:
- Time series, auto-correlation, regression, etc. in statistics (GARCH) (62M10)
- Bayesian inference (62F15)
- Monte Carlo methods (65C05)
- Discrete-time Markov processes on general state spaces (60J05)
Related Items (3)
Smoothing With Couplings of Conditional Particle Filters ⋮ Unnamed Item ⋮ Efficient \(\mathrm{SMC}^2\) schemes for stochastic kinetic models
Uses Software
Cites Work
- Can local particle filters beat the curse of dimensionality?
- Particle approximations of the score and observed information matrix in state space models with application to parameter estimation
- Establishing some order amongst exact approximations of MCMCs
- Stability properties of some particle filters
- A general framework for the parametrization of hierarchical models
- Optimal filtering and the dual process
- Stability of Feynman-Kac formulae with path-dependent potentials
- Approximate Bayesian computational methods
- A nonasymptotic theorem for unnormalized Feynman-Kac particle models
- Time series analysis via mechanistic models
- Bayesian model averaging: A tutorial. (with comments and a rejoinder).
- Biased online parameter inference for state-space models
- Central limit theorem for sequential Monte Carlo methods and its application to Bayesian inference
- Inference in hidden Markov models.
- Efficient learning via simulation: a marginalized resample-move approach
- On the stability of sequential Monte Carlo methods in high dimensions
- On particle Gibbs sampling
- Following a Moving Target—Monte Carlo Inference for Dynamic Bayesian Models
- Sequential Monte Carlo Methods in Practice
- Bridging the ensemble Kalman and particle filters
- Uniform Ergodicity of the Particle Gibbs Sampler
- A sequential particle filter method for static models
- Bayesian Inference for Linear Dynamic Models With Dirichlet Process Mixtures
- High-throughput scalable parallel resampling mechanism for effective redistribution of particles
- Sequential Monte Carlo Samplers: Error Bounds and Insensitivity to Initial Conditions
- Efficient implementation of Markov chain Monte Carlo when using an unbiased likelihood estimator
- The Bayesian Choice
- Resampling algorithms and architectures for distributed particle filters
- On Disturbance State-Space Models and the Particle Marginal Metropolis-Hastings Sampler
- Simulating normalizing constants: From importance sampling to bridge sampling to path sampling
- On parallel implementation of sequential Monte Carlo methods: the island particle model
- On the role of interaction in sequential Monte Carlo algorithms