A general framework for the parametrization of hierarchical models
From MaRDI portal
Publication: 449750
DOI: 10.1214/088342307000000014
zbMATH Open: 1246.62195
arXiv: 0708.3797
OpenAlex: W1997628954
MaRDI QID: Q449750
FDO: Q449750
Authors: Omiros Papaspiliopoulos, Martin Sköld, Gareth O. Roberts
Publication date: 1 September 2012
Published in: Statistical Science
Abstract: In this paper, we describe centering and noncentering methodology as complementary techniques for use in parametrization of broad classes of hierarchical models, with a view to the construction of effective MCMC algorithms for exploring posterior distributions from these models. We give a clear qualitative understanding as to when centering and noncentering work well, and introduce theory concerning the convergence time complexity of Gibbs samplers using centered and noncentered parametrizations. We give general recipes for the construction of noncentered parametrizations, including an auxiliary variable technique called the state-space expansion technique. We also describe partially noncentered methods, and demonstrate their use in constructing robust Gibbs sampler algorithms whose convergence properties are not overly sensitive to the data.
Full work available at URL: https://arxiv.org/abs/0708.3797
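The centered/noncentered contrast described in the abstract is easiest to see on a toy one-way normal hierarchical model. The sketch below is not taken from the paper; the model, variable names, and settings (known variances, flat prior on the mean) are illustrative assumptions. It runs a Gibbs sampler under each parametrization and compares the lag-1 autocorrelation of the chain for the hyperparameter mu: with the latent effects only weakly identified by the data, the noncentered chain mixes much better, and the ordering reverses when the data are highly informative, in line with the paper's qualitative message.

```python
# Minimal sketch (illustrative, not the paper's code): centered vs noncentered
# Gibbs samplers for the toy model
#   y_ij ~ N(theta_i, sigma2),  theta_i ~ N(mu, tau2),  flat prior on mu,
# with sigma2 and tau2 treated as known.
import numpy as np

rng = np.random.default_rng(0)
I, n, sigma2, tau2 = 20, 5, 1.0, 0.01     # small tau2: centering mixes poorly here
mu_true = 2.0
theta_true = rng.normal(mu_true, np.sqrt(tau2), I)
y = rng.normal(theta_true[:, None], np.sqrt(sigma2), (I, n))
ybar = y.mean(axis=1)

def gibbs_centered(iters=5000):
    """Sample (theta, mu) directly: the centered parametrization."""
    mu = 0.0
    mus = np.empty(iters)
    for t in range(iters):
        prec = n / sigma2 + 1.0 / tau2                    # posterior precision of each theta_i
        theta = rng.normal((n * ybar / sigma2 + mu / tau2) / prec, np.sqrt(1.0 / prec))
        mu = rng.normal(theta.mean(), np.sqrt(tau2 / I))  # flat prior on mu
        mus[t] = mu
    return mus

def gibbs_noncentered(iters=5000):
    """Sample (eps, mu), where theta_i = mu + sqrt(tau2) * eps_i with eps_i ~ N(0, 1) a priori."""
    tau = np.sqrt(tau2)
    mu = 0.0
    mus = np.empty(iters)
    for t in range(iters):
        prec = n * tau2 / sigma2 + 1.0                    # posterior precision of each eps_i
        eps = rng.normal((tau / sigma2) * (y - mu).sum(axis=1) / prec, np.sqrt(1.0 / prec))
        resid = y - tau * eps[:, None]                    # y_ij - tau * eps_i ~ N(mu, sigma2)
        mu = rng.normal(resid.mean(), np.sqrt(sigma2 / (I * n)))
        mus[t] = mu
    return mus

# Lag-1 autocorrelation of the mu-chain as a crude mixing diagnostic.
for name, chain in [("centered", gibbs_centered()), ("noncentered", gibbs_noncentered())]:
    c = chain - chain.mean()
    print(name, "lag-1 autocorrelation:", round(np.dot(c[:-1], c[1:]) / np.dot(c, c), 3))
```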
Recommendations
- A model search procedure for hierarchical models
- Hierarchical models as marginals of hierarchical models
- Semi-parametric marginal models for hierarchical data and their corresponding full models
- Asymptotic Estimates of Hierarchical Modeling
- Hierarchical estimation of parameters in Bayesian networks
- On nonparametric Bayesian hierarchical modelling
Mathematics Subject Classification (MSC): Bayesian inference (62F15); Inference from stochastic processes (62M99); Numerical analysis or methods applied to Markov chains (65C40)
Cites Work
- Fast sampling of Gaussian Markov random fields
- Bayesian Prediction of Spatial Count Data Using Generalized Linear Mixed Models
- Bayesian curve fitting using multivariate normal mixtures
- Model-Based Geostatistics
- Log Gaussian Cox Processes
- Partial non-Gaussian state space
- On Gibbs sampling for state space models
- Fitting Gaussian Markov Random Fields to Gaussian Fields
- Bayesian Inference for Non-Gaussian Ornstein–Uhlenbeck Stochastic Volatility Processes
- Efficient parametrisations for normal linear mixed models
- Prior distributions on spaces of probability measures
- Estimating Normal Means with a Dirichlet Process Prior
- Exact and Computationally Efficient Likelihood-Based Estimation for Discretely Observed Diffusion Processes (with Discussion)
- Stability of the Gibbs sampler for Bayesian hierarchical models
- On Block Updating in Markov Random Field Models for Disease Mapping
- Markov chain Monte Carlo in approximate Dirichlet and beta two-parameter process hierarchical models
- Retrospective Markov chain Monte Carlo methods for Dirichlet process hierarchical models
- Bayesian Nonparametric Inference for Random Distributions and Related Functions
- Semiparametric Bayesian inference for stochastic frontier models
- On inference for partially observed nonlinear diffusion models using the Metropolis-Hastings algorithm
- The Limiting Distribution of the Serial Correlation Coefficient in the Explosive Case
- Non-Gaussian State-Space Modeling of Nonstationary Time Series
- Likelihood analysis of a first-order autoregressive model with exponential innovations
- Honest exploration of intractable probability distributions via Markov chain Monte Carlo.
- On rates of convergence of stochastic relaxation for Gaussian and non- Gaussian distributions
- Statistical inference in a two-compartment model for hematopoiesis
- A Bayesian method for classification and discrimination
- Can Markov chain Monte Carlo be usefully applied to stochastic processes with hidden birth times?
- Gibbs Sampling
Cited In (69)
- Conditionally structured variational Gaussian approximation with importance weights
- Variational inference for generalized linear mixed models using partially noncentered parametrizations
- A conversation with Alan Gelfand
- Improving the convergence properties of the data augmentation algorithm with an application to Bayesian mixture modeling
- Hyperpriors for Matérn fields with applications in Bayesian inversion
- Accelerating parallel tempering: Quantile tempering algorithm (QuanTA)
- Hyperparameter estimation in Bayesian MAP estimation: parameterizations and consistency
- Specification and identification issues in models involving a latent hierarchical structure
- Sequential Bayesian inference for implicit hidden Markov models and current limitations
- A non-parametric Bayesian approach to decompounding from high frequency data
- A stochastic variational framework for fitting and diagnosing generalized linear mixed models
- Coupling stochastic EM and approximate Bayesian computation for parameter inference in state-space models
- Time-varying sparsity in dynamic regression models
- Particle methods for stochastic differential equation mixed effects models
- Stochastic epidemic models inference and diagnosis with Poisson random measure data augmentation
- Approximate Bayesian inference for latent Gaussian models by using integrated nested Laplace approximations (with discussion)
- Fast Sampling in a Linear-Gaussian Inverse Problem
- Locally adaptive smoothing with Markov random fields and shrinkage priors
- Scalable Bayesian computation for crossed and nested hierarchical models
- An Exact Auxiliary Variable Gibbs Sampler for a Class of Diffusions
- Development of a novel computational model for the balloon analogue risk task: the exponential-weight mean-variance model
- Reconciling Bayesian and Perimeter Regularization for Binary Inversion
- Bayesian computation: a summary of the current state, and samples backwards and forwards
- Probabilistic prediction of neurological disorders with a statistical assessment of neuroimaging data modalities
- Adaptive multiple importance sampling for Gaussian processes
- Latent diffusion models for survival analysis
- Normal approximation for hierarchical structures
- Using hierarchical centering to facilitate a reversible jump MCMC algorithm for random effects models
- Variable transformation to obtain geometric ergodicity in the random-walk Metropolis algorithm
- Data transforming augmentation for heteroscedastic models
- How Deep Are Deep Gaussian Processes?
- Achieving shrinkage in a time-varying parameter model framework
- Non-stationary multi-layered Gaussian priors for Bayesian inversion
- Structured hierarchical models for probabilistic inference from perturbation screening data
- Parameterizations for ensemble Kalman inversion
- Efficient Parameter Sampling for Markov Jump Processes
- Low-Rank Independence Samplers in Hierarchical Bayesian Inverse Problems
- Markov Chain Monte Carlo for Exact Inference for Diffusions
- Modelling multi-output stochastic frontiers using copulas
- Fitting stochastic epidemic models to gene genealogies using linear noise approximation
- Multilevel structured additive regression
- Selecting the precision parameter prior in Dirichlet process mixture models
- A model search procedure for hierarchical models
- Comparison and assessment of epidemic models
- Sampling hyperparameters in hierarchical models: Improving on Gibbs for high-dimensional latent fields and large datasets
- Geometric ergodicity of Gibbs samplers for Bayesian general linear mixed models with proper priors
- On reparametrization and the Gibbs sampler
- Metropolized Randomized Maximum Likelihood for Improved Sampling from Multimodal Distributions
- Adaptive inference over Besov spaces in the white noise model using \(p\)-exponential priors
- Bayesian Conditional Transformation Models
- A Bayesian multilevel model for populations of networks using exponential-family random graphs
- Nested sampling methods
- Efficient data augmentation techniques for some classes of state space models
- Dimension-free mixing times of Gibbs samplers for Bayesian hierarchical models
- Learning variational autoencoders via MCMC speed measures
- Exact Bayesian Inference for Diffusion-Driven Cox Processes
- Parameter estimation with increased precision for elliptic and hypo-elliptic diffusions
- Bayesian prediction of jumps in large panels of time series data
- Hybrid iterative ensemble smoother for history matching of hierarchical models
- Nonparametric Posterior Learning for Emission Tomography
- Multilevel linear models, Gibbs samplers and multigrid decompositions (with discussion)
- Sampling algorithms in statistical physics: a guide for statistics and machine learning
- Efficient inference of generalized spatial fusion models with flexible specification
- Informative Bayesian neural network priors for weak signals
- Non-centered parametric variational Bayes’ approach for hierarchical inverse problems of partial differential equations