Sufficient burn-in for Gibbs samplers for a hierarchical random effects model.

From MaRDI portal

DOI: 10.1214/009053604000000184
zbMath: 1048.62069
arXiv: math/0406454
OpenAlex: W3099871107
MaRDI QID: Q1879955

James P. Hobert, Galin L. Jones

Publication date: 15 September 2004

Published in: The Annals of Statistics

Full work available at URL: https://arxiv.org/abs/math/0406454



Related Items

- A mixture representation of \(\pi\) with applications in Markov chain Monte Carlo and perfect sampling.
- Convergence rates of two-component MCMC samplers
- A computational procedure for estimation of the mixing time of the random-scan Metropolis algorithm
- Exploring stochasticity and imprecise knowledge based on linear inequality constraints
- Batch means and spectral variance estimators in Markov chain Monte Carlo
- Assessing and Visualizing Simultaneous Simulation Error
- Complexity bounds for Markov chain Monte Carlo algorithms via diffusion limits
- Convergence rates of attractive-repulsive MCMC algorithms
- Nonasymptotic Bounds on the Mean Square Error for MCMC Estimates via Renewal Techniques
- Rigorous confidence bounds for MCMC under a geometric drift condition
- Reflections on Bayesian inference and Markov chain Monte Carlo
- Complexity results for MCMC derived from quantitative bounds
- Multilevel linear models, Gibbs samplers and multigrid decompositions (with discussion)
- Geometric ergodicity of a hybrid sampler for Bayesian inference of phylogenetic branch lengths
- Gibbs sampling, exponential families and orthogonal polynomials
- Comment: ``Gibbs sampling, exponential families, and orthogonal polynomials''
- Markov chain Monte Carlo: can we trust the third significant figure?
- Optimal scaling of random-walk Metropolis algorithms on general target distributions
- Numerical simulation of polynomial-speed convergence phenomenon
- Gibbs sampling, conjugate priors and coupling
- Nonasymptotic bounds on the estimation error of MCMC algorithms
- Exact sampling for intractable probability distributions via a Bernoulli factory
- Gibbs sampling for a Bayesian hierarchical general linear model
- Convergence rate of Markov chain methods for genomic motif discovery
- Convergence analysis of the block Gibbs sampler for Bayesian probit linear mixed models with improper priors
- Geometric ergodicity for Bayesian shrinkage models
- Variance bounding Markov chains
- Markov chain Monte Carlo estimation of quantiles
- Geometric ergodicity of random scan Gibbs samplers for hierarchical one-way random effects models
- Accelerating a Gibbs sampler for variable selection on genomics data with summarization and variable pre-selection combining an array DBMS and R
- On Monte Carlo methods for Bayesian multivariate regression models with heavy-tailed errors
- Convergence of Conditional Metropolis-Hastings Samplers
- Quantitative non-geometric convergence bounds for independence samplers
- Variable transformation to obtain geometric ergodicity in the random-walk Metropolis algorithm
- Exponential concentration inequalities for additive functionals of Markov chains
- Geometric Ergodicity and Scanning Strategies for Two-Component Gibbs Samplers
- Hitting time and convergence rate bounds for symmetric Langevin diffusions
- Component-wise Markov chain Monte Carlo: uniform and geometric ergodicity under mixing and composition
- Convergence analysis of a collapsed Gibbs sampler for Bayesian vector autoregressions
- Using a Markov Chain to Construct a Tractable Approximation of an Intractable Probability Distribution
- Geometric ergodicity of Gibbs samplers for Bayesian general linear mixed models with proper priors



Cites Work