Convergence analysis of block Gibbs samplers for Bayesian linear mixed models with p>N
From MaRDI portal
Publication:502882
DOI: 10.3150/15-BEJ749 · zbMATH Open: 1368.62061 · arXiv: 1502.05460 · OpenAlex: W2101142126 · MaRDI QID: Q502882 · FDO: Q502882
Authors: Tavis Abrahamsen, James P. Hobert
Publication date: 11 January 2017
Published in: Bernoulli
Abstract: Exploration of the intractable posterior distributions associated with Bayesian versions of the general linear mixed model is often performed using Markov chain Monte Carlo. In particular, if a conditionally conjugate prior is used, then there is a simple two-block Gibbs sampler available. Román and Hobert [Linear Algebra Appl. 473 (2015) 54-77] showed that, when the priors are proper and the design matrix \(X\) has full column rank, the Markov chains underlying these Gibbs samplers are nearly always geometrically ergodic. In this paper, Román and Hobert's (2015) result is extended by allowing improper priors on the variance components, and, more importantly, by removing all assumptions on \(X\). So, not only is \(X\) allowed to be (column) rank deficient, which provides additional flexibility in parameterizing the fixed effects, it is also allowed to have more columns than rows, which is necessary in the increasingly important situation where \(p>N\). The full column rank assumption on \(X\) is at the heart of Román and Hobert's (2015) proof. Consequently, the extension to unrestricted \(X\) requires a substantially different analysis.
Full work available at URL: https://arxiv.org/abs/1502.05460
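To illustrate the two-block structure the abstract refers to, the following is a minimal sketch of a two-block Gibbs sampler for a simple Bayesian linear model with a conditionally conjugate (inverse-gamma) variance prior. It is not the mixed-model sampler analyzed in the paper; the model, the flat prior on the regression coefficients, and the hyperparameters `a0`, `b0` are illustrative assumptions, and this sketch requires full-rank \(X\) — precisely the assumption the paper shows how to remove.

```python
import numpy as np

def two_block_gibbs(y, X, n_iter=2000, seed=0):
    """Illustrative two-block Gibbs sampler for y = X beta + eps,
    eps ~ N(0, sigma^2 I), with a flat prior on beta and an
    inverse-gamma(a0, b0) prior on sigma^2.  This only sketches the
    alternating-blocks idea; it is NOT the paper's mixed-model sampler,
    and it assumes X has full column rank."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    a0, b0 = 2.0, 1.0                      # assumed prior hyperparameters
    XtX, Xty = X.T @ X, X.T @ y
    beta = np.zeros(p)
    draws = np.empty((n_iter, p + 1))      # columns: beta_1..beta_p, sigma^2
    for t in range(n_iter):
        # Block 1: sigma^2 | beta, y  ~  inverse-gamma
        resid = y - X @ beta
        sigma2 = 1.0 / rng.gamma(a0 + n / 2.0,
                                 1.0 / (b0 + 0.5 * resid @ resid))
        # Block 2: beta | sigma^2, y  ~  multivariate normal
        cov = sigma2 * np.linalg.inv(XtX)
        beta = rng.multivariate_normal(np.linalg.solve(XtX, Xty), cov)
        draws[t] = np.concatenate([beta, [sigma2]])
    return draws
```

The paper's contribution concerns when chains of this alternating form are geometrically ergodic once \(X\) may be rank deficient or have \(p>N\); the sketch above would need a generalized inverse in that regime.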
Recommendations
- Convergence analysis of the block Gibbs sampler for Bayesian probit linear mixed models with improper priors
- Convergence analysis of the Gibbs sampler for Bayesian general linear mixed models with improper priors
- Block Gibbs samplers for logistic mixed models: convergence properties and a comparison with full Gibbs samplers
- Dimension free convergence rates for Gibbs samplers for Bayesian linear mixed models
- Analysis of the Pólya-gamma block Gibbs sampler for Bayesian logistic linear mixed models
- On the convergence complexity of Gibbs samplers for a family of simple Bayesian random effects models
- A blocked Gibbs sampler for NGG-mixture models via a priori truncation
- On the convergence rate of the "out-of-order" block Gibbs sampler
Computational methods in Markov chains (60J22) Bayesian inference (62F15) Linear regression; mixed models (62J05)
Cited In (7)
- Convergence analysis of a collapsed Gibbs sampler for Bayesian vector autoregressions
- Block Gibbs samplers for logistic mixed models: convergence properties and a comparison with full Gibbs samplers
- Dimension free convergence rates for Gibbs samplers for Bayesian linear mixed models
- On the convergence rate of the "out-of-order" block Gibbs sampler
- Convergence analysis of the Gibbs sampler for Bayesian general linear mixed models with improper priors
- Fast Monte Carlo Markov chains for Bayesian shrinkage models with random effects
- Geometric ergodicity of Gibbs samplers for Bayesian general linear mixed models with proper priors
This page was built for publication: Convergence analysis of block Gibbs samplers for Bayesian linear mixed models with \(p>N\)