MCMC-driven importance samplers
Publication: 2110113
DOI: 10.1016/J.APM.2022.06.027
zbMATH Open: 1505.62468
arXiv: 2105.02579
OpenAlex: W3157699234
MaRDI QID: Q2110113
FDO: Q2110113
David Delgado, Víctor Elvira, Luca Martino, E. Curbelo, Fernando Rodriguez Llorente
Publication date: 21 December 2022
Published in: Applied Mathematical Modelling
Abstract: Monte Carlo sampling methods are the standard procedure for approximating complicated integrals of multidimensional posterior distributions in Bayesian inference. In this work, we focus on the class of Layered Adaptive Importance Sampling (LAIS) schemes, a family of adaptive importance samplers in which Markov chain Monte Carlo algorithms are employed to drive an underlying multiple importance sampling scheme. The modular nature of LAIS allows for different possible implementations, yielding a variety of performance levels and computational costs. In this work, we propose several enhancements of the classical LAIS setting in order to increase the efficiency and reduce the computational cost of both the upper and lower layers. The different variants address computational challenges arising in real-world applications, for instance with highly concentrated posterior distributions. Furthermore, we introduce strategies for designing cheaper schemes, for instance by recycling samples generated in the upper layer and using them in the final estimators in the lower layer. Numerical experiments covering several challenging scenarios show the benefits of the proposed schemes compared with benchmark methods from the literature.
Full work available at URL: https://arxiv.org/abs/2105.02579
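The abstract describes the two-layer structure of LAIS: an upper layer in which MCMC chains explore the posterior and their states become the location parameters of proposal densities, and a lower layer in which those proposals feed a multiple importance sampling estimator with deterministic-mixture weights. The following Python sketch illustrates that idea on a toy 2-D target; the random-walk Metropolis upper layer, the Gaussian proposals, the step size, and all function names are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np
from scipy.stats import multivariate_normal

# Hypothetical illustration of the two-layer LAIS idea described in the
# abstract; target, proposal family, and tuning values are assumptions.

def log_target(x):
    # Example unnormalized log-posterior: a banana-shaped density.
    return -0.5 * (x[0]**2 / 4.0 + (x[1] + 0.5 * x[0]**2)**2)

def upper_layer_mh(n_iters, x0, step=0.5, rng=None):
    """Random-walk Metropolis chain; its states become proposal means."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.array(x0, dtype=float)
    logp = log_target(x)
    means = []
    for _ in range(n_iters):
        prop = x + step * rng.standard_normal(x.shape)
        logp_prop = log_target(prop)
        if np.log(rng.uniform()) < logp_prop - logp:  # MH accept/reject
            x, logp = prop, logp_prop
        means.append(x.copy())
    return np.array(means)

def lower_layer_dm_is(means, m_per_proposal=5, cov=None, rng=None):
    """Multiple importance sampling with deterministic-mixture weights,
    using the upper-layer chain states as Gaussian proposal means."""
    rng = np.random.default_rng() if rng is None else rng
    dim = means.shape[1]
    cov = np.eye(dim) if cov is None else cov
    # Draw m_per_proposal samples around each chain state.
    samples = np.vstack([rng.multivariate_normal(mu, cov, size=m_per_proposal)
                         for mu in means])
    # Deterministic-mixture denominator: average of all proposal densities.
    mix = np.mean([multivariate_normal.pdf(samples, mean=mu, cov=cov)
                   for mu in means], axis=0)
    logw = np.array([log_target(x) for x in samples]) - np.log(mix)
    w = np.exp(logw - logw.max())     # stabilize before normalizing
    return samples, w / w.sum()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    means = upper_layer_mh(200, x0=[0.0, 0.0], rng=rng)        # upper layer
    samples, w = lower_layer_dm_is(means, rng=rng)              # lower layer
    print("Self-normalized IS estimate of E[x]:", samples.T @ w)
```

One of the cost-saving strategies mentioned in the abstract, recycling samples generated in the upper layer in the final estimators, would correspond here to also weighting the entries of `means` rather than discarding them after the proposals have been built.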
Cites Work
- Title not available
- Title not available
- Title not available
- Title not available
- Sequential Monte Carlo Samplers
- Adaptive Rejection Sampling for Gibbs Sampling
- Adaptive rejection Metropolis sampling using Lagrange interpolation polynomials of degree 2
- Adaptive Rejection Metropolis Sampling within Gibbs Sampling
- Adaptive Multiple Importance Sampling
- Advanced Markov Chain Monte Carlo Methods
- Convergence rates for optimised adaptive importance samplers
- Layered adaptive importance sampling
- Generalized multiple importance sampling
- An Adaptive Population Importance Sampler: Learning From Uncertainty
- Compressed Monte Carlo with application in particle filtering
- On a Metropolis-Hastings importance sampling estimator
- Marginal Likelihood Computation for Model Selection and Hypothesis Testing: An Extensive Review
Cited In (11)
- Unconstrained recursive importance sampling
- Title not available
- Generalized Poststratification and Importance Sampling for Subsampled Markov Chain Monte Carlo Estimation
- Generalized integral transform and Hamiltonian Monte Carlo for Bayesian structural damage identification
- Policy Gradient Importance Sampling for Bayesian Inference
- Coupling importance sampling and multilevel Monte Carlo using sample average approximation
- Importance-Weighted Marginal Bayesian Posterior Density Estimation
- Iterative importance sampling with Markov chain Monte Carlo sampling in robust Bayesian analysis
- Gibbs sampler by sampling-importance-resampling
- Importance Sampling-Based Transport Map Hamiltonian Monte Carlo for Bayesian Hierarchical Models
- Langevin incremental mixture importance sampling