Convergence analysis of a collapsed Gibbs sampler for Bayesian vector autoregressions
DOI: 10.1214/21-EJS1800 · zbMath: 1476.62056 · arXiv: 1907.03170 · OpenAlex: W3124264335 · MaRDI QID: Q2044318
Karl Oskar Ekvall, Galin L. Jones
Publication date: 9 August 2021
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/1907.03170
Keywords: Markov chain Monte Carlo, Gibbs sampler, geometric ergodicity, Bayesian vector autoregression, convergence complexity analysis
MSC classifications: Time series, auto-correlation, regression, etc. in statistics (GARCH) (62M10) · Bayesian inference (62F15) · Markov processes: estimation; hidden Markov models (62M05)
Related Items (5)
Convergence rates of two-component MCMC samplers ⋮ Assessing and Visualizing Simultaneous Simulation Error ⋮ Exact convergence analysis for Metropolis–Hastings independence samplers in Wasserstein distances ⋮ On the limitations of single-step drift and minorization in Markov chain convergence analysis ⋮ On the convergence complexity of Gibbs samplers for a family of simple Bayesian random effects models
Cites Work
- Markov chain Monte Carlo estimation of quantiles
- Geometric ergodicity of random scan Gibbs samplers for hierarchical one-way random effects models
- Convergence analysis of block Gibbs samplers for Bayesian linear mixed models with \(p>N\)
- On the computational complexity of high-dimensional Bayesian variable selection
- Markov chains and stochastic stability
- Markov chain Monte Carlo: can we trust the third significant figure?
- On the Markov chain central limit theorem
- Some interlacing properties of the Schur complement of a Hermitian matrix
- Honest exploration of intractable probability distributions via Markov chain Monte Carlo.
- Convergence control methods for Markov chain Monte Carlo algorithms
- Geometric ergodicity of Metropolis algorithms
- Strong consistency of multivariate spectral variance estimators in Markov chain Monte Carlo
- Sufficient burn-in for Gibbs samplers for a hierarchical random effects model.
- Convergence complexity analysis of Albert and Chib's algorithm for Bayesian probit regression
- Batch means and spectral variance estimators in Markov chain Monte Carlo
- Geometric ergodicity of Gibbs samplers in Bayesian penalized regression models
- Nonasymptotic bounds on the estimation error of MCMC algorithms
- Markov Chains and De-initializing Processes
- Fixed-Width Output Analysis for Markov Chain Monte Carlo
- Forecasting in vector autoregressions with many predictors
- Geometric convergence and central limit theorems for multidimensional Hastings and Metropolis algorithms
- The Collapsed Gibbs Sampler in Bayesian Computations with Applications to a Gene Regulation Problem
- Convergence Analysis of MCMC Algorithms for Bayesian Multivariate Linear Regression with Non‐Gaussian Errors
- Minorization Conditions and Convergence Rates for Markov Chain Monte Carlo
- Multivariate output analysis for Markov chain Monte Carlo
- High-Dimensional Posterior Consistency in Bayesian Vector Autoregressive Models
- MCMC for Imbalanced Categorical Data
- An Introduction to Matrix Concentration Inequalities