Distributed Bayesian Inference in Linear Mixed-Effects Models
DOI: 10.1080/10618600.2020.1869025 · OpenAlex: W3122960922 · MaRDI QID: Q5066445
Yixiang Xu, Sanvesh Srivastava
Publication date: 29 March 2022
Published in: Journal of Computational and Graphical Statistics
Full work available at URL: https://doi.org/10.1080/10618600.2020.1869025
Keywords: data augmentation, Wasserstein distance, Wasserstein barycenter, divide-and-conquer, Monte Carlo error, location-scatter family
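The keywords point to the paper's divide-and-conquer theme: posteriors computed on data subsets are combined through a Wasserstein barycenter within a location-scatter family. The following is a minimal illustrative sketch, not the authors' implementation; it assumes a univariate Gaussian approximation to each subset posterior, for which the 2-Wasserstein barycenter with uniform weights has mean equal to the average of the subset means and standard deviation equal to the average of the subset standard deviations. All names in the snippet are hypothetical.

# Illustrative sketch only (assumed univariate Gaussian subset posteriors),
# not the method exactly as implemented in the paper.
import numpy as np

def gaussian_w2_barycenter(subset_draws):
    """subset_draws: list of 1-D arrays, each holding posterior draws of one
    scalar parameter from one data subset; returns the barycenter's mean and
    standard deviation under the Gaussian approximation."""
    means = np.array([d.mean() for d in subset_draws])
    sds = np.array([d.std(ddof=1) for d in subset_draws])
    return means.mean(), sds.mean()

# Example: three hypothetical subset posteriors for one fixed effect.
rng = np.random.default_rng(0)
draws = [rng.normal(loc=mu, scale=sd, size=5000)
         for mu, sd in [(1.02, 0.11), (0.97, 0.12), (1.05, 0.10)]]
bary_mean, bary_sd = gaussian_w2_barycenter(draws)
print(f"combined posterior approx. N({bary_mean:.3f}, {bary_sd:.3f}^2)")

In the multivariate location-scatter case the barycenter covariance is instead obtained by a fixed-point iteration, as in the cited work on a fixed-point approach to barycenters in Wasserstein space.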
Related Items (3)
- Distributed Bayesian inference in massive spatial data
- Divide-and-conquer Bayesian inference in hidden Markov models
- Asynchronous and Distributed Data Augmentation for Massive Data Settings
Cites Work
- A fixed-point approach to barycenters in Wasserstein space
- On high-dimensional misspecified mixed model analysis in genome-wide association study
- On the rate of convergence in Wasserstein distance of the empirical measure
- Flexible results for quadratic forms with applications to variance components estimation
- Efficient moment calculations for variance components in large unbalanced crossed random effects models
- Bayesian linear regression with sparse priors
- Split Hamiltonian Monte Carlo
- A stochastic variational framework for fitting and diagnosing generalized linear mixed models
- Subsampling MCMC -- an introduction for the survey statistician
- Double-parallel Monte Carlo for Bayesian analysis of big data
- Geometric ergodicity of Gibbs samplers for Bayesian general linear mixed models with proper priors
- Noisy Monte Carlo: convergence of Markov chains with approximate transition kernels
- Approximate Bayesian Inference for Latent Gaussian models by using Integrated Nested Laplace Approximations
- Barycenters in the Wasserstein Space
- Streamlined mean field variational Bayes for longitudinal and multilevel data analysis
- Geometry-Sensitive Ensemble Mean Based on Wasserstein Barycenters: Proof-of-Concept on Cloud Simulations
- Partially Collapsed Gibbs Samplers
- Seeking efficient data augmentation schemes via conditional and marginal augmentation
- Robust and Scalable Bayes via a Median of Subset Posterior Measures
- Scalable Bayes under Informative Sampling
- MCMC for Imbalanced Categorical Data
- Fast Moment-Based Estimation for Hierarchical Models
- Simple, scalable and accurate posterior interval estimation