Convergence analysis of the Gibbs sampler for Bayesian general linear mixed models with improper priors
From MaRDI portal
Publication:741808
DOI: 10.1214/12-AOS1052 · zbMATH Open: 1296.60204 · arXiv: 1111.3210 · MaRDI QID: Q741808
Authors: Jorge Carlos Román, James P. Hobert
Publication date: 15 September 2014
Published in: The Annals of Statistics
Abstract: Bayesian analysis of data from the general linear mixed model is challenging because any nontrivial prior leads to an intractable posterior density. However, if a conditionally conjugate prior density is adopted, then there is a simple Gibbs sampler that can be employed to explore the posterior density. A popular default among the conditionally conjugate priors is an improper prior that takes a product form with a flat prior on the regression parameter, and so-called power priors on each of the variance components. In this paper, a convergence rate analysis of the corresponding Gibbs sampler is undertaken. The main result is a simple, easily checked sufficient condition for geometric ergodicity of the Gibbs-Markov chain. This result is close to the best possible result in the sense that the sufficient condition is only slightly stronger than what is required to ensure posterior propriety. The theory developed in this paper is extremely important from a practical standpoint because it guarantees the existence of central limit theorems that allow for the computation of valid asymptotic standard errors for the estimates computed using the Gibbs sampler.
Full work available at URL: https://arxiv.org/abs/1111.3210
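For concreteness, the two-block Gibbs sampler described in the abstract can be sketched as follows for the model \(y = X\beta + Zu + e\) with a single random-effects variance component: one block draws the regression parameters and random effects from a multivariate normal full conditional, the other draws the two variance components from inverse-gamma full conditionals. This is a minimal illustrative sketch, not the paper's implementation: the dimensions, the simulated data, and the power-prior exponents `a_e = a_u = 0` (the standard \((\sigma^2)^{-1}\) default) are assumptions chosen for the example. A batch-means helper illustrates the CLT-based Monte Carlo standard errors that geometric ergodicity justifies.

```python
import numpy as np

def gibbs(y, X, Z, a_e=0.0, a_u=0.0, n_iter=2000, seed=1):
    """Two-block Gibbs sampler for y = X beta + Z u + e, with e ~ N(0, s2e I)
    and u ~ N(0, s2u I), under the improper prior that is flat on beta and
    places power priors (s2)^{-(a+1)} on each variance component."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    q = Z.shape[1]
    W = np.hstack([X, Z])
    s2e, s2u = 1.0, 1.0  # arbitrary starting values for the variances
    draws = np.empty((n_iter, p + q + 2))
    for t in range(n_iter):
        # Block 1: (beta, u) | s2e, s2u, y is multivariate normal.
        prec = W.T @ W / s2e
        prec[p:, p:] += np.eye(q) / s2u  # ridge term from the u prior
        cov = np.linalg.inv(prec)
        cov = (cov + cov.T) / 2.0  # symmetrize against round-off
        mean = cov @ (W.T @ y) / s2e
        theta = rng.multivariate_normal(mean, cov)
        # Block 2: the variances are conditionally independent inverse gammas.
        resid = y - W @ theta
        u = theta[p:]
        s2e = 1.0 / rng.gamma(n / 2.0 + a_e, 2.0 / (resid @ resid))
        s2u = 1.0 / rng.gamma(q / 2.0 + a_u, 2.0 / (u @ u))
        draws[t] = np.concatenate([theta, [s2e, s2u]])
    return draws

def batch_means_se(x, n_batches=20):
    """Monte Carlo standard error of the mean of x via nonoverlapping batch
    means -- the CLT-based error estimate that geometric ergodicity of the
    chain makes valid."""
    b = len(x) // n_batches
    means = x[: b * n_batches].reshape(n_batches, b).mean(axis=1)
    return means.std(ddof=1) / np.sqrt(n_batches)

# Illustrative run on simulated data: 5 groups of 10 observations each.
rng = np.random.default_rng(0)
n, p, q = 50, 2, 5
X = rng.standard_normal((n, p))
Z = np.kron(np.eye(q), np.ones((n // q, 1)))  # group-indicator design
y = X @ np.array([1.0, -0.5]) + Z @ rng.normal(0.0, 1.0, q) + rng.normal(0.0, 0.5, n)
draws = gibbs(y, X, Z)
mcse = batch_means_se(draws[500:, 0])  # MCSE for the posterior mean of beta_1
```

The paper's sufficient condition for geometric ergodicity concerns exactly this kind of chain; when it holds, `batch_means_se` (or the spectral variance estimators cited below) yields asymptotically valid standard errors for the posterior-mean estimates.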
Recommendations
- Geometric ergodicity of Gibbs samplers for Bayesian general linear mixed models with proper priors
- Convergence analysis of block Gibbs samplers for Bayesian linear mixed models with \(p>N\)
- Gibbs sampling for a Bayesian hierarchical general linear model
- Convergence analysis of the block Gibbs sampler for Bayesian probit linear mixed models with improper priors
- The Effect of Improper Priors on Gibbs Sampling in Hierarchical Linear Mixed Models
Keywords: Markov chain Monte Carlo; convergence rate; geometric ergodicity; geometric drift condition; posterior propriety
Cites Work
- Covariance structure of the Gibbs sampler with applications to the comparisons of estimators and augmentation schemes
- Markov chain Monte Carlo: can we trust the third significant figure?
- Prior distributions for variance parameters in hierarchical models (Comment on article by Browne and Draper)
- Markov chains and stochastic stability
- General state space Markov chains and MCMC algorithms
- Gibbs sampling for a Bayesian hierarchical general linear model
- Batch means and spectral variance estimators in Markov chain Monte Carlo
- Stability of the Gibbs sampler for Bayesian hierarchical models
- Fixed-Width Output Analysis for Markov Chain Monte Carlo
- Markov-chain Monte Carlo: some practical implications of theoretical results
- Propriety of posteriors with improper priors in hierarchical linear mixed models
- A spectral analytic comparison of trace-class data augmentation algorithms and their sandwich variants
- Markov chains and de-initializing processes
- Gibbs sampling, exponential families and orthogonal polynomials
- Convergence analysis of the Gibbs sampler for Bayesian general linear mixed models with improper priors
- A prior for the variance in hierarchical models
Cited In (18)
- Convergence analysis of the block Gibbs sampler for Bayesian probit linear mixed models with improper priors
- Block Gibbs samplers for logistic mixed models: convergence properties and a comparison with full Gibbs samplers
- Dimension free convergence rates for Gibbs samplers for Bayesian linear mixed models
- Geometric convergence bounds for Markov chains in Wasserstein distance based on generalized drift and contraction conditions
- Gibbs sampling for a Bayesian hierarchical general linear model
- Stability of the Gibbs sampler for Bayesian hierarchical models
- On the propriety of the posterior of hierarchical linear mixed models with flexible random effects distributions
- Large-Sample Joint Posterior Approximations When Full Conditionals Are Approximately Normal
- Bayesian regression analysis of data with random effects covariates from nonlinear longitudinal measurements
- Convergence analysis of block Gibbs samplers for Bayesian linear mixed models with \(p>N\)
- Component-wise Markov chain Monte Carlo: uniform and geometric ergodicity under mixing and composition
- Convergence analysis of the Gibbs sampler for Bayesian general linear mixed models with improper priors
- Wasserstein-based methods for convergence complexity analysis of MCMC with applications
- Geometric ergodicity for Bayesian shrinkage models
- Fast Monte Carlo Markov chains for Bayesian shrinkage models with random effects
- Density regression and uncertainty quantification with Bayesian deep noise neural networks
- Geometric ergodicity of Gibbs samplers for Bayesian general linear mixed models with proper priors
- On reparametrization and the Gibbs sampler