Nonasymptotic Bounds on the Mean Square Error for MCMC Estimates via Renewal Techniques
DOI: 10.1007/978-3-642-27440-4_31 · zbMATH: 1271.65007 · arXiv: 1101.5837 · OpenAlex: W2113896743 · Wikidata: Q96748975 · Scholia: Q96748975 · MaRDI QID: Q5326129
Krzysztof Łatuszyński, Wojciech Niemiro, Błażej Miasojedow
Publication date: 31 July 2013
Published in: Springer Proceedings in Mathematics & Statistics
Full work available at URL: https://arxiv.org/abs/1101.5837
Keywords: algorithm; confidence intervals; renewal theory; mean square error; nonasymptotic bounds; sequential statistics; Markov chain Monte Carlo trajectory; split chain construction
MSC classification: Computational methods in Markov chains (60J22); Parametric tolerance and confidence regions (62F25); Monte Carlo methods (65C05); Numerical analysis or methods applied to Markov chains (65C40)
Related Items (4)
Cites Work
- Rigorous confidence bounds for MCMC under a geometric drift condition
- Markov chains and stochastic stability
- On Monte Carlo methods for Bayesian multivariate regression models with heavy-tailed errors
- General state space Markov chains and MCMC algorithms
- Explicit error bounds for lazy reversible Markov chain Monte Carlo
- On variance conditions for Markov chain CLTs
- Random generation of combinatorial structures from a uniform distribution
- Geometric ergodicity of Gibbs and block Gibbs samplers for a hierarchical random effects model
- Geometric ergodicity and hybrid Markov chains
- \(V\)-subgeometric ergodicity for a Hastings-Metropolis algorithm
- Catalytic perfect simulation
- Hoeffding's inequality for uniformly ergodic Markov chains
- Renewal theory and computable convergence rates for geometrically ergodic Markov chains
- A mixture representation of \(\pi\) with applications in Markov chain Monte Carlo and perfect sampling.
- Sufficient burn-in for Gibbs samplers for a hierarchical random effects model.
- Rates of convergence of the Hastings and Metropolis algorithms
- Gibbs sampling for a Bayesian hierarchical general linear model
- Nonasymptotic bounds on the estimation error of MCMC algorithms
- A splitting technique for Harris recurrent Markov chains
- A New Approach to the Limit Theory of Recurrent Markov Chains
- On the geometric ergodicity of hybrid samplers
- On the applicability of regenerative simulation in Markov chain Monte Carlo
- MC's for MCMC'ists
- Regeneration in Markov Chain Samplers
- Minorization Conditions and Convergence Rates for Markov Chain Monte Carlo
- How to couple from the past using a read-once source of randomness
- Fixed Precision MCMC Estimation by Median of Products of Averages
- On the geometric ergodicity of Metropolis-Hastings algorithms
- On Excess Over the Boundary
- A regeneration proof of the central limit theorem for uniformly ergodic Markov chains