Error Bounds and Normalising Constants for Sequential Monte Carlo Samplers in High Dimensions
Publication: 5415104
DOI: 10.1239/aap/1396360114 · zbMath: 1291.65009 · OpenAlex: W1901296138 · MaRDI QID: Q5415104
Alexandros Beskos, Ajay Jasra, Nick Whiteley, Dan Crisan
Publication date: 9 May 2014
Published in: Advances in Applied Probability
Full work available at URL: https://projecteuclid.org/euclid.aap/1396360114
Related Items
- Sequential ensemble transform for Bayesian inverse problems
- A Lagged Particle Filter for Stable Filtering of Certain High-Dimensional State-Space Models
- Advanced Multilevel Monte Carlo Methods
- An optimal control approach to particle filtering
- Sequential estimation of temporally evolving latent space network models
- Error bounds for sequential Monte Carlo samplers for multimodal distributions
- A Particle Filter for Stochastic Advection by Lie Transport: A Case Study for the Damped and Forced Incompressible Two-Dimensional Euler Equation
- Marginalized approximate filtering of state-space models
- Normalizing constants of log-concave densities
- Bayesian model comparison with un-normalised likelihoods
- Inference on high-dimensional implicit dynamic models using a guided intermediate resampling filter
- Can local particle filters beat the curse of dimensionality?
- On the stability of sequential Monte Carlo methods in high dimensions
- Sequential Monte Carlo methods for Bayesian elliptic inverse problems
- Gradient free parameter estimation for hidden Markov models with intractable likelihoods
- Adaptive kernels in approximate filtering of state-space models
- A stable particle filter for a class of high-dimensional state-space models
Cites Work
- Can local particle filters beat the curse of dimensionality?
- On adaptive resampling strategies for sequential Monte Carlo methods
- Linear variance bounds for particle approximations of time-homogeneous Feynman-Kac formulae
- An adaptive sequential Monte Carlo method for approximate Bayesian computation
- A nonasymptotic theorem for unnormalized Feynman-Kac particle models
- Weak convergence and optimal scaling of random walk Metropolis algorithms
- Optimal scaling for various Metropolis-Hastings algorithms.
- Large deviations asymptotics and the spectral theory of multiplicatively regular Markov processes
- Optimal scaling of MaLa for nonlinear regression.
- Quantitative approximations of evolving probability measures and sequential Markov chain Monte Carlo methods
- Weak convergence of Metropolis algorithms for non-i.i.d. target distributions
- On the stability of sequential Monte Carlo methods in high dimensions
- Sequential Monte Carlo Methods for High-Dimensional Inverse Problems: A Case Study for the Navier--Stokes Equations
- Sequential Monte Carlo Samplers
- Curse-of-dimensionality revisited: Collapse of the particle filter in very large scale systems
- Optimal Scaling of Discrete Approximations to Langevin Diffusions
- A sequential particle filter method for static models
- Sequential Monte Carlo Samplers: Error Bounds and Insensitivity to Initial Conditions