Convergence of Conditional Metropolis-Hastings Samplers
From MaRDI portal
Publication:5169501
DOI: 10.1239/aap/1401369701 · zbMath: 1379.60082 · OpenAlex: W2139831589 · MaRDI QID: Q5169501
Gareth O. Roberts, Jeffrey S. Rosenthal, Galin L. Jones
Publication date: 10 July 2014
Published in: Advances in Applied Probability
Full work available at URL: https://projecteuclid.org/euclid.aap/1401369701
convergence rate, Gibbs sampler, geometric ergodicity, Markov chain Monte Carlo algorithm, independence sampler
Computational methods in Markov chains (60J22); Bayesian inference (62F15); Monte Carlo methods (65C05); Discrete-time Markov processes on general state spaces (60J05)
Related Items
- Geometric ergodicity of Gibbs samplers for the horseshoe and its regularized variants
- Convergence rates of two-component MCMC samplers
- Assessing and Visualizing Simultaneous Simulation Error
- Geometric ergodicity of a more efficient conditional Metropolis-Hastings algorithm
- Bayesian variable selection in a finite mixture of linear mixed-effects models
- Geometric ergodicity of random scan Gibbs samplers for hierarchical one-way random effects models
- Fast Monte Carlo Markov chains for Bayesian shrinkage models with random effects
- Component-wise Markov chain Monte Carlo: uniform and geometric ergodicity under mixing and composition
- Improving efficiency of data augmentation algorithms using Peskun's theorem
- A hybrid scan Gibbs sampler for Bayesian models with latent variables
Cites Work
- Quantitative non-geometric convergence bounds for independence samplers
- Markov chains and stochastic stability
- Markov chain Monte Carlo: can we trust the third significant figure?
- General state space Markov chains and MCMC algorithms
- On the Markov chain central limit theorem
- Geometric ergodicity of Gibbs and block Gibbs samplers for a hierarchical random effects model
- A note on Metropolis-Hastings kernels for general state spaces
- Two convergence properties of hybrid samplers
- Comparison theorems for reversible Markov chains
- Geometric ergodicity and hybrid Markov chains
- Honest exploration of intractable probability distributions via Markov chain Monte Carlo.
- Convergence control methods for Markov chain Monte Carlo algorithms
- Geometric ergodicity of Metropolis algorithms
- Sufficient burn-in for Gibbs samplers for a hierarchical random effects model.
- Markov chains for exploring posterior distributions. (With discussion)
- Rates of convergence of the Hastings and Metropolis algorithms
- Gibbs sampling for a Bayesian hierarchical general linear model
- Stability of the Gibbs sampler for Bayesian hierarchical models
- Markov Chains and De-initializing Processes
- On inference for partially observed nonlinear diffusion models using the Metropolis-Hastings algorithm
- Bounds on the L² Spectrum for Markov Chains and Markov Processes: A Generalization of Cheeger's Inequality
- Optimum Monte-Carlo sampling using Markov chains
- Fixed-Width Output Analysis for Markov Chain Monte Carlo
- Sampling-Based Approaches to Calculating Marginal Densities
- Handbook of Markov Chain Monte Carlo
- Geometric convergence and central limit theorems for multidimensional Hastings and Metropolis algorithms
- Convergence of Slice Sampler Markov Chains
- Improved Bounds for Mixing Rates of Markov Chains and Multicommodity Flow
- Covariance structure of the Gibbs sampler with applications to the comparisons of estimators and augmentation schemes
- Likelihood Inference for Discretely Observed Nonlinear Diffusions
- Minorization Conditions and Convergence Rates for Markov Chain Monte Carlo
- Geometric Ergodicity of van Dyk and Meng's Algorithm for the Multivariate Student's t Model
- Component-wise Markov chain Monte Carlo: uniform and geometric ergodicity under mixing and composition