Subsampling sequential Monte Carlo for static Bayesian models
From MaRDI portal
Abstract: We show how to speed up Sequential Monte Carlo (SMC) for Bayesian inference in large data problems by data subsampling. SMC sequentially updates a cloud of particles through a sequence of distributions, beginning with a distribution that is easy to sample from, such as the prior, and ending with the posterior distribution. Each update of the particle cloud consists of three steps: reweighting, resampling, and moving. In the move step, each particle is moved using a Markov kernel; this is typically the most computationally expensive part of the algorithm, particularly when the dataset is large, yet an efficient move step is crucial for maintaining particle diversity. Our article makes two important contributions. First, to speed up the SMC computation, we use an approximately unbiased and efficient annealed likelihood estimator based on data subsampling. The subsampling approach is more memory efficient than the corresponding full-data SMC, which is an advantage for parallel computation. Second, we use a Metropolis-within-Gibbs kernel with two conditional updates: a Hamiltonian Monte Carlo update makes distant moves for the model parameters, and a block pseudo-marginal proposal updates the auxiliary variables that index the data subsample. We demonstrate both the usefulness and the limitations of the methodology by estimating four generalized linear models and a generalized additive model on large datasets.
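The reweight/resample/move cycle with a subsampled likelihood estimator can be sketched as follows. This is a minimal illustration, not the paper's implementation: it targets a toy Gaussian-mean model, uses a difference (control-variate) estimator of the log-likelihood whose quadratic surrogates happen to be exact for this model (so the subsampled correction has zero variance here, unlike the general case), and replaces the paper's HMC and block pseudo-marginal updates with a simple random-walk Metropolis move. All variable names and tuning choices are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "large" dataset: y_i ~ N(theta_true, 1), unknown mean theta,
# prior theta ~ N(0, 10^2).
n, theta_true = 10_000, 1.5
y = rng.normal(theta_true, 1.0, size=n)
S1, S2 = y.sum(), (y ** 2).sum()  # sufficient statistics for the surrogates

def loglik_hat(theta, m=100):
    """Difference estimator of the full-data log-likelihood:
    exact sum of quadratic surrogates q_i plus a subsampled correction
    (n/m) * sum_{i in subsample} (l_i - q_i).  For this Gaussian toy the
    surrogates equal the true per-observation log-densities, so the
    correction vanishes; in general it is a low-variance Monte Carlo term."""
    sum_q = -0.5 * n * np.log(2 * np.pi) - 0.5 * (S2 - 2 * theta * S1 + n * theta ** 2)
    idx = rng.integers(0, n, size=m)                       # simple random subsample
    l_sub = -0.5 * np.log(2 * np.pi) - 0.5 * (y[idx] - theta) ** 2
    q_sub = l_sub                                          # surrogate is exact here
    return sum_q + (n / m) * (l_sub - q_sub).sum()

def smc_subsample(P=200):
    # Likelihood-annealing temperature schedule from prior (t=0) to posterior (t=1).
    temps = np.concatenate(([0.0], np.logspace(-4, 0, 30)))
    theta = rng.normal(0.0, 10.0, size=P)                  # draws from the prior
    logw = np.zeros(P)
    ll = np.array([loglik_hat(t) for t in theta])          # held fixed between moves
    for t0, t1 in zip(temps[:-1], temps[1:]):
        # 1) reweight by the annealed (estimated) likelihood increment
        logw += (t1 - t0) * ll
        # 2) resample when the effective sample size drops below P/2
        w = np.exp(logw - logw.max()); w /= w.sum()
        if 1.0 / (w ** 2).sum() < P / 2:
            idx = rng.choice(P, size=P, p=w)
            theta, ll = theta[idx], ll[idx]
            logw[:] = 0.0
        # 3) move: random-walk Metropolis on the tempered posterior
        #    (the paper instead uses HMC plus block pseudo-marginal updates)
        step = 1.0 / np.sqrt(0.01 + t1 * n)                # ~ tempered posterior sd
        for _ in range(3):
            prop = theta + rng.normal(0.0, step, size=P)
            ll_prop = np.array([loglik_hat(t) for t in prop])
            lp_cur = t1 * ll - 0.5 * (theta / 10.0) ** 2
            lp_prop = t1 * ll_prop - 0.5 * (prop / 10.0) ** 2
            acc = np.log(rng.uniform(size=P)) < lp_prop - lp_cur
            theta[acc], ll[acc] = prop[acc], ll_prop[acc]
    w = np.exp(logw - logw.max()); w /= w.sum()
    return float(w @ theta)

posterior_mean = smc_subsample()
```

With this setup the weighted particle mean should land near the data mean (close to `theta_true`). Retaining each particle's log-likelihood estimate `ll` until a move is accepted mimics the pseudo-marginal convention the abstract refers to, although the zero-variance surrogate makes that distinction invisible in this toy example.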
Cites work
- scientific article; zbMATH DE number 1666084
- scientific article; zbMATH DE number 6781368
- scientific article; zbMATH DE number 3189754
- A sequential particle filter method for static models
- An adaptive sequential Monte Carlo method for approximate Bayesian computation
- An adaptive sequential Monte Carlo sampler
- Bayes Factors
- Component-wise Markov chain Monte Carlo: uniform and geometric ergodicity under mixing and composition
- Hamiltonian Monte Carlo with energy conserving subsampling
- Handbook of Markov Chain Monte Carlo
- Inference for Lévy-driven stochastic volatility models via adaptive sequential Monte Carlo
- Langevin diffusions and Metropolis-Hastings algorithms
- Marginal Likelihood From the Metropolis–Hastings Output
- Monte Carlo strategies in scientific computing
- On the convergence of adaptive sequential Monte Carlo methods
- Parallelizing particle filters with butterfly interactions
- Sequential Monte Carlo Samplers
- Sequential Monte Carlo samplers with independent Markov chain Monte Carlo proposals
- Speeding Up MCMC by Efficient Data Subsampling
- Subsampling MCMC -- an introduction for the survey statistician
- The correlated pseudomarginal method
- Weak convergence and optimal scaling of random walk Metropolis algorithms
Cited in (17)
- A one-pass sequential Monte Carlo method for Bayesian analysis of massive datasets
- Subsampling MCMC -- an introduction for the survey statistician
- Generalized Poststratification and Importance Sampling for Subsampled Markov Chain Monte Carlo Estimation
- Informed sub-sampling MCMC: approximate Bayesian inference for large datasets
- Merging MCMC subposteriors through Gaussian-process approximations
- Hamiltonian Monte Carlo with energy conserving subsampling
- Sequential tests for large-scale learning
- Sub-sample swapping for sequential Monte Carlo approximation of high-dimensional densities in the context of complex object tracking
- Selection sampling from large data sets for targeted inference in mixture modeling
- Most likely optimal subsampled Markov chain Monte Carlo
- Accelerating sequential Monte Carlo with surrogate likelihoods
- Efficient Sequential Monte-Carlo Samplers for Bayesian Inference
- Using Approximate Bayesian Computation by Subset Simulation for Efficient Posterior Assessment of Dynamic State-Space Model Classes
- An Approach to Incorporate Subsampling Into a Generic Bayesian Hierarchical Model
- Speeding Up MCMC by Efficient Data Subsampling
- Subsampling the Gibbs sampler: variance reduction
- Distributed computation for marginal likelihood based model choice
This page was built for publication: Subsampling sequential Monte Carlo for static Bayesian models (MaRDI item Q2209734)