Gibbs sampling for a Bayesian hierarchical general linear model
From MaRDI portal
Publication:1952054
Abstract: We consider a Bayesian hierarchical version of the normal theory general linear model which is practically relevant in the sense that it is general enough to have many applications, yet it is not straightforward to sample directly from the corresponding posterior distribution. Thus we study a block Gibbs sampler that has the posterior as its invariant distribution. In particular, we establish that the Gibbs sampler converges at a geometric rate. This allows us to establish conditions for a central limit theorem for the ergodic averages used to estimate features of the posterior. Geometric ergodicity is also a key component for using batch means methods to consistently estimate the variance of the asymptotic normal distribution. Together, our results give practitioners the tools to be as confident in inferences based on the observations from the Gibbs sampler as they would be with inferences based on random samples from the posterior. Our theoretical results are illustrated with an application to data on the cost of health plans issued by health maintenance organizations.
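To make the abstract's two ingredients concrete, the following is a minimal sketch of a two-block Gibbs sampler together with a batch means estimate of the Monte Carlo standard error. It uses a toy two-level normal model (not the paper's general linear model), with all variances fixed at 1 and a N(0, 100) hyperprior on the group-level mean; every model choice here is an illustrative assumption, not the paper's setup.

```python
import numpy as np

# Toy hierarchical model (illustrative assumptions, not the paper's model):
#   y_i | theta_i ~ N(theta_i, 1),  theta_i | mu ~ N(mu, 1),  mu ~ N(0, 100).
# The sampler alternates between the theta block and mu; afterwards the
# batch means method estimates the asymptotic variance of the ergodic
# average, which is consistent under geometric ergodicity.

rng = np.random.default_rng(0)

# Simulated data: 50 groups, true hyper-mean mu = 2.
m = 50
theta_true = rng.normal(2.0, 1.0, size=m)
y = rng.normal(theta_true, 1.0)

n_iter, burn = 5500, 500
mu = 0.0
mu_draws = np.empty(n_iter - burn)

for t in range(n_iter):
    # Block 1: theta | mu, y -- conditionally independent normals,
    # each with precision 2 (data precision 1 + prior precision 1).
    theta = rng.normal((y + mu) / 2.0, np.sqrt(0.5))
    # Block 2: mu | theta -- normal with precision m + 1/100.
    prec = m + 0.01
    mu = rng.normal(theta.sum() / prec, np.sqrt(1.0 / prec))
    if t >= burn:
        mu_draws[t - burn] = mu

# Batch means: split the chain into a batches of size b ~ sqrt(n);
# the variance of the batch means, scaled by b, estimates the
# asymptotic variance in the Markov chain CLT.
n = len(mu_draws)
b = int(np.sqrt(n))
a = n // b
batch_means = mu_draws[:a * b].reshape(a, b).mean(axis=1)
sigma2_hat = b * batch_means.var(ddof=1)
mcse = np.sqrt(sigma2_hat / n)

print(f"posterior mean of mu: {mu_draws.mean():.3f} +/- {mcse:.4f}")
```

The reported interval is exactly what the abstract's CLT licenses: because the chain is geometrically ergodic, the ergodic average satisfies a central limit theorem and the batch means estimate of its variance is consistent, so the Monte Carlo error bar can be trusted.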
Recommendations
- Stability of the Gibbs sampler for Bayesian hierarchical models
- scientific article; zbMATH DE number 549940
- Geometric ergodicity of Gibbs samplers for Bayesian general linear mixed models with proper priors
- Geometric ergodicity of Gibbs and block Gibbs samplers for a hierarchical random effects model
- Convergence analysis of the Gibbs sampler for Bayesian general linear mixed models with improper priors
Cites work
- scientific article; zbMATH DE number 1911984
- Batch means and spectral variance estimators in Markov chain Monte Carlo
- Fixed-Width Output Analysis for Markov Chain Monte Carlo
- Geometric ergodicity of Gibbs and block Gibbs samplers for a hierarchical random effects model
- Honest exploration of intractable probability distributions via Markov chain Monte Carlo
- Markov chain Monte Carlo: can we trust the third significant figure?
- Markov chains and de-initializing processes
- Markov chains and stochastic stability
- Markov chains for exploring posterior distributions. (With discussion)
- On Deriving the Inverse of a Sum of Matrices
- On the applicability of regenerative simulation in Markov chain Monte Carlo
- Rates of convergence for Gibbs sampling for variance component models
- Regeneration in Markov Chain Samplers
- Some Algebra and Geometry for Hierarchical Models, Applied to Diagnostics
- Stability of the Gibbs sampler for Bayesian hierarchical models
- Sufficient burn-in for Gibbs samplers for a hierarchical random effects model
- The asymptotic validity of sequential stopping rules for stochastic simulations
- Using a Markov Chain to Construct a Tractable Approximation of an Intractable Probability Distribution
Cited in (28)
- Convergence rate of Markov chain methods for genomic motif discovery
- A Bayesian approach to nonlinear latent variable models using the Gibbs sampler and the Metropolis-Hastings algorithm
- An Approach to Incorporate Subsampling Into a Generic Bayesian Hierarchical Model
- The Efficiency of Next-Generation Gibbs-Type Samplers: An Illustration Using a Hierarchical Model in Cosmology
- scientific article; zbMATH DE number 6948153
- Component-wise Markov chain Monte Carlo: uniform and geometric ergodicity under mixing and composition
- Bayesian regression analysis of data with random effects covariates from nonlinear longitudinal measurements
- On computation using Gibbs sampling for multilevel models
- Gibbs sampling using the data augmentation scheme for higher-order item response models
- Stability of the Gibbs sampler for Bayesian hierarchical models
- Exponential concentration inequalities for additive functionals of Markov chains
- Convergence analysis of the Gibbs sampler for Bayesian general linear mixed models with improper priors
- Geometric ergodicity of Gibbs samplers for Bayesian general linear mixed models with proper priors
- scientific article; zbMATH DE number 833423
- Variable transformation to obtain geometric ergodicity in the random-walk Metropolis algorithm
- Convergence analysis of the block Gibbs sampler for Bayesian probit linear mixed models with improper priors
- Analysis of the Gibbs Sampler for Hierarchical Inverse Problems
- Nested sampling for general Bayesian computation
- Applicability of subsampling bootstrap methods in Markov chain Monte Carlo
- Partially Collapsed Gibbs Sampling for Linear Mixed-effects Models
- Nonasymptotic bounds on the estimation error of MCMC algorithms
- Exact Bayesian inference for normal hierarchical models
- Convergence of conditional Metropolis-Hastings samplers
- Nonasymptotic bounds on the mean square error for MCMC estimates via renewal techniques
- Geometric ergodicity and scanning strategies for two-component Gibbs samplers
- Rigorous confidence bounds for MCMC under a geometric drift condition
- Bayesian Inference for Generalized Linear and Proportional Hazards Models via Gibbs Sampling
- Bayesian Analysis of Linear and Non-Linear Population Models by Using the Gibbs Sampler