Convergence of conditional Metropolis-Hastings samplers
DOI: 10.1239/AAP/1401369701 · zbMATH Open: 1379.60082 · OpenAlex: W2139831589 · MaRDI QID: Q5169501
Authors: Galin L. Jones, Gareth O. Roberts, Jeffrey S. Rosenthal
Publication date: 10 July 2014
Published in: Advances in Applied Probability
Full work available at URL: https://projecteuclid.org/euclid.aap/1401369701
Recommendations
- On the convergence of the Metropolis-Hastings Markov chains
- Simple conditions for the convergence of the Gibbs sampler and Metropolis-Hastings algorithms
- Adaptive Gibbs samplers and related MCMC methods
- Two convergence properties of hybrid samplers
- Minorization Conditions and Convergence Rates for Markov Chain Monte Carlo
Keywords: Gibbs sampler; convergence rate; geometric ergodicity; Markov chain Monte Carlo algorithm; independence sampler
MSC classifications: Computational methods in Markov chains (60J22); Bayesian inference (62F15); Monte Carlo methods (65C05); Discrete-time Markov processes on general state spaces (60J05)
Cites Work
- Markov chains for exploring posterior distributions. (With discussion)
- Covariance structure of the Gibbs sampler with applications to the comparisons of estimators and augmentation schemes
- Markov chain Monte Carlo: can we trust the third significant figure?
- Sampling-Based Approaches to Calculating Marginal Densities
- Handbook of Markov Chain Monte Carlo
- Title not available
- A note on Metropolis-Hastings kernels for general state spaces
- Geometric ergodicity of Metropolis algorithms
- Geometric convergence and central limit theorems for multidimensional Hastings and Metropolis algorithms
- Markov chains and stochastic stability
- General state space Markov chains and MCMC algorithms
- On the Markov chain central limit theorem
- Geometric ergodicity of Gibbs and block Gibbs samplers for a hierarchical random effects model
- Geometric ergodicity and hybrid Markov chains
- Sufficient burn-in for Gibbs samplers for a hierarchical random effects model.
- Rates of convergence of the Hastings and Metropolis algorithms
- Gibbs sampling for a Bayesian hierarchical general linear model
- Stability of the Gibbs sampler for Bayesian hierarchical models
- Fixed-Width Output Analysis for Markov Chain Monte Carlo
- Minorization Conditions and Convergence Rates for Markov Chain Monte Carlo
- On inference for partially observed nonlinear diffusion models using the Metropolis-Hastings algorithm
- Title not available
- Likelihood Inference for Discretely Observed Nonlinear Diffusions
- Comparison theorems for reversible Markov chains
- Honest exploration of intractable probability distributions via Markov chain Monte Carlo.
- Convergence control methods for Markov chain Monte Carlo algorithms
- Optimum Monte-Carlo sampling using Markov chains
- Bounds on the L² Spectrum for Markov Chains and Markov Processes: A Generalization of Cheeger's Inequality
- Improved Bounds for Mixing Rates of Markov Chains and Multicommodity Flow
- Markov chains and de-initializing processes
- Convergence of Slice Sampler Markov Chains
- Title not available
- Geometric Ergodicity of van Dyk and Meng's Algorithm for the Multivariate Student's t Model
- Component-wise Markov chain Monte Carlo: uniform and geometric ergodicity under mixing and composition
- Quantitative non-geometric convergence bounds for independence samplers
- Two convergence properties of hybrid samplers
Cited In (15)
- Uniform ergodicity of the iterated conditional SMC and geometric ergodicity of particle Gibbs samplers
- Bayesian variable selection in a finite mixture of linear mixed-effects models
- A modified conditional Metropolis-Hastings sampler
- A hybrid scan Gibbs sampler for Bayesian models with latent variables
- Geometric ergodicity of Gibbs samplers for the horseshoe and its regularized variants
- Convergence rates of two-component MCMC samplers
- Convergence of the equi-energy sampler
- Improving efficiency of data augmentation algorithms using Peskun's theorem
- Convergence rates of Metropolis-Hastings algorithms
- Geometric ergodicity of a more efficient conditional Metropolis-Hastings algorithm
- Geometric ergodicity of random scan Gibbs samplers for hierarchical one-way random effects models
- Component-wise Markov chain Monte Carlo: uniform and geometric ergodicity under mixing and composition
- Assessing and Visualizing Simultaneous Simulation Error
- On convergence of the iterative conditional estimations
- Fast Monte Carlo Markov chains for Bayesian shrinkage models with random effects