Convergence Analysis of MCMC Algorithms for Bayesian Multivariate Linear Regression with Non‐Gaussian Errors
DOI: 10.1111/SJOS.12310 · zbMATH Open: 1403.62132 · OpenAlex: W2782752599 · MaRDI QID: Q4685441
Yeun Ji Jung, Kshitij Khare, James P. Hobert, Qian Qin
Publication date: 8 October 2018
Published in: Scandinavian Journal of Statistics
Full work available at URL: https://doi.org/10.1111/sjos.12310
Keywords: drift condition; scale mixture; geometric ergodicity; heavy-tailed distribution; data augmentation algorithm; minorization condition; Haar PX-DA algorithm
MSC classification: Computational methods in Markov chains (60J22) · Linear regression; mixed models (62J05) · Estimation in multivariate analysis (62H12)
Cited In (4)
- Convergence analysis of a collapsed Gibbs sampler for Bayesian vector autoregressions
- A hybrid scan Gibbs sampler for Bayesian models with latent variables
- Convergence rates for MCMC algorithms for a robust Bayesian binary regression model
- Trace-class Monte Carlo Markov chains for Bayesian multivariate linear regression with non-Gaussian errors