Optimal scaling for partially updating MCMC algorithms
From MaRDI portal
Abstract: In this paper we shall consider optimal scaling problems for high-dimensional Metropolis-Hastings algorithms where updates can be chosen to be lower dimensional than the target density itself. We find that the optimal scaling rule for the Metropolis algorithm, which tunes the overall algorithm acceptance rate to 0.234, holds for the so-called Metropolis-within-Gibbs algorithm as well. Furthermore, the optimal efficiency obtainable is independent of the dimensionality of the update rule. This has important implications for the MCMC practitioner: since high-dimensional updates are generally more demanding computationally, lower-dimensional updates are to be preferred. Similar results, with rather different conclusions, are given for so-called Langevin updates. In that case, high-dimensional updates are frequently the most efficient, even after computing costs are taken into account.
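As an illustration of the abstract's setting, the sketch below runs a random-walk Metropolis-within-Gibbs sampler on a d-dimensional standard normal product target, updating a random block of k coordinates per iteration. The block proposal scale 2.38/sqrt(k) is the classical optimal-scaling heuristic; the empirical acceptance rate then lands near the 0.234 value discussed in the paper. All function and parameter names here are illustrative choices, not notation from the paper.

```python
import numpy as np

def mwg_sampler(d=100, k=20, n_iter=20000, seed=0):
    """Random-walk Metropolis-within-Gibbs on a standard normal
    product target in d dimensions, updating a uniformly chosen
    block of k coordinates per iteration.

    Illustrative sketch only: the proposal scale 2.38/sqrt(k)
    follows the classical optimal-scaling heuristic, under which
    the acceptance rate approaches ~0.234 for large k.
    Returns the empirical acceptance rate.
    """
    rng = np.random.default_rng(seed)
    x = np.zeros(d)
    scale = 2.38 / np.sqrt(k)
    accepts = 0
    for _ in range(n_iter):
        # choose which k of the d coordinates to update
        idx = rng.choice(d, size=k, replace=False)
        prop = x[idx] + scale * rng.standard_normal(k)
        # log target is -0.5 * sum(x**2); only the block changes,
        # so the log acceptance ratio involves just those coordinates
        log_alpha = 0.5 * (x[idx] @ x[idx] - prop @ prop)
        if np.log(rng.random()) < log_alpha:
            x[idx] = prop
            accepts += 1
    return accepts / n_iter

if __name__ == "__main__":
    rate = mwg_sampler()
    print(f"empirical acceptance rate: {rate:.3f}")
```

For moderate block sizes the printed rate should sit in the vicinity of 0.234 (finite-k effects push it somewhat higher), consistent with the paper's finding that the 0.234 rule carries over to partial, lower-dimensional updates.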
Recommendations
- Optimal scaling for various Metropolis-Hastings algorithms.
- Optimal scaling of Metropolis algorithms: Heading toward general target distributions
- Optimal Scaling of Discrete Approximations to Langevin Diffusions
- Optimal scaling of MCMC beyond Metropolis
- Optimal scaling of the random walk Metropolis: general criteria for the 0.234 acceptance rule
Cites work
- scientific article; zbMATH DE number 3951715 (no title available)
- scientific article; zbMATH DE number 4020069 (no title available)
- scientific article; zbMATH DE number 48363 (no title available)
- scientific article; zbMATH DE number 1085989 (no title available)
- scientific article; zbMATH DE number 3274494 (no title available)
- From Metropolis to diffusions: Gibbs states and optimal scaling.
- Langevin diffusions and Metropolis-Hastings algorithms
- Optimal Scaling of Discrete Approximations to Langevin Diffusions
- Optimal scaling for various Metropolis-Hastings algorithms.
- Weak convergence and optimal scaling of random walk Metropolis algorithms
Cited in (25)
- Statistical analysis of an endemic disease from a capture-recapture experiment
- Optimal acceptance rates for Metropolis algorithms: Moving beyond 0.234
- Optimal scaling of random walk Metropolis algorithms with discontinuous target densities
- Optimal scaling of random-walk Metropolis algorithms on general target distributions
- Optimal scaling of MCMC beyond Metropolis
- The random walk Metropolis: linking theory and practice through a case study
- Optimal scaling for random walk Metropolis on spherically constrained target densities
- Bayesian joint modeling of high-dimensional discrete multivariate longitudinal data using generalized linear mixed models
- Bayesian generalized linear low rank regression models for the detection of vaccine-adverse event associations
- Estimating and Projecting Trends in HIV/AIDS Generalized Epidemics Using Incremental Mixture Importance Sampling
- Hierarchical models: local proposal variances for RWM-within-Gibbs and MALA-within-Gibbs
- Complexity bounds for Markov chain Monte Carlo algorithms via diffusion limits
- Optimal scaling of random walk Metropolis algorithms with non-Gaussian proposals
- MALA-within-Gibbs samplers for high-dimensional distributions with sparse conditional structure
- Weak convergence and optimal tuning of the reversible jump algorithm
- Solution of the inverse scattering problem from inhomogeneous media using affine invariant sampling
- A Dirichlet form approach to MCMC optimal scaling
- Weak convergence of Metropolis algorithms for non-I.I.D. target distributions
- Optimal scalings for local Metropolis-Hastings chains on nonproduct targets in high dimensions
- Optimal scaling of the independence sampler: theory and practice
- Component-wise Markov chain Monte Carlo: uniform and geometric ergodicity under mixing and composition
- Hierarchical models and tuning of random walk Metropolis algorithms
- Mixing of MCMC algorithms
- Bayesian Variable Selections for Probit Models with Componentwise Gibbs Samplers
- Optimal scaling of Metropolis algorithms: Heading toward general target distributions
MaRDI item: Q997939