Scalable Bayesian model averaging through local information propagation


DOI: 10.1080/01621459.2014.980908
zbMATH Open: 1419.62168
arXiv: 1403.2397
OpenAlex: W1966543131
MaRDI QID: Q5367402


Author: Li Ma


Publication date: 13 October 2017

Published in: Journal of the American Statistical Association

Abstract: We show that a probabilistic version of the classical forward-stepwise variable inclusion procedure can serve as a general data-augmentation scheme for model space distributions in (generalized) linear models. This latent variable representation takes the form of a Markov process, thereby allowing information propagation algorithms to be applied for sampling from model space posteriors. In particular, we propose a sequential Monte Carlo method for achieving effective unbiased Bayesian model averaging in high-dimensional problems, utilizing proposal distributions constructed using local information propagation. We illustrate our method---called LIPS for local information propagation based sampling---through real and simulated examples with dimensionality ranging from 15 to 1,000, and compare its performance in estimating posterior inclusion probabilities and in out-of-sample prediction to those of several other methods---namely, MCMC, BAS, iBMA, and LASSO. In addition, we show that the latent variable representation can also serve as a modeling tool for specifying model space priors that account for knowledge regarding model complexity and conditional inclusion relationships.


Full work available at URL: https://arxiv.org/abs/1403.2397
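
The abstract describes the approach only at a high level; the full construction is in the linked paper. As a rough illustration of the generic idea it builds on, namely sequential Monte Carlo over variable-inclusion decisions for Bayesian model averaging, the sketch below runs a basic SMC sampler for variable selection in a linear model. It is not the authors' LIPS algorithm and does not use their local-information-propagation proposals; the Zellner g-prior marginal likelihood, the independent Bernoulli model prior, and the names log_marginal, smc_bma, n_particles, and prior_incl are assumptions chosen for illustration only.

```python
import numpy as np

def log_marginal(y, X, cols, g):
    """Log marginal likelihood (up to a constant shared by all models) of the
    linear model y ~ X[:, cols] under a Zellner g-prior on the coefficients
    and a flat prior on the noise variance. Assumes y and X are centered."""
    n = y.size
    yty = y @ y
    if not cols:
        return -0.5 * n * np.log(yty)
    Xs = X[:, cols]
    beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    fit = y @ (Xs @ beta)                       # y' P_M y, fitted sum of squares
    return (-0.5 * len(cols) * np.log(1 + g)
            - 0.5 * n * np.log(yty - g / (1 + g) * fit))

def smc_bma(y, X, n_particles=500, prior_incl=0.2, seed=0):
    """Basic sequential Monte Carlo over variable-inclusion decisions.
    Each particle is a growing set of included predictors; at step j it decides
    whether variable j enters, proposing from the one-step (locally optimal)
    conditional and updating its importance weight so that the final weighted
    particles target the model posterior. Returns estimated posterior
    inclusion probabilities."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    g = float(n)                                # unit-information g-prior
    models = [[] for _ in range(n_particles)]
    log_prior = np.zeros(n_particles)           # log prior of each partial model
    log_marg = np.full(n_particles, log_marginal(y, X, [], g))
    logw = np.zeros(n_particles)                # log importance weights
    for j in range(p):
        for i in range(n_particles):
            lp_out = log_prior[i] + np.log1p(-prior_incl) + log_marg[i]
            lm_in = log_marginal(y, X, models[i] + [j], g)
            lp_in = log_prior[i] + np.log(prior_incl) + lm_in
            top = max(lp_out, lp_in)
            log_total = top + np.log(np.exp(lp_out - top) + np.exp(lp_in - top))
            # Incremental weight: extended target summed over both choices,
            # divided by the particle's previous unnormalized target.
            logw[i] += log_total - (log_prior[i] + log_marg[i])
            if rng.random() < np.exp(lp_in - log_total):
                models[i] = models[i] + [j]
                log_prior[i] += np.log(prior_incl)
                log_marg[i] = lm_in
            else:
                log_prior[i] += np.log1p(-prior_incl)
        # Resample when the effective sample size falls below half the particles.
        w = np.exp(logw - logw.max())
        w /= w.sum()
        if 1.0 / np.sum(w ** 2) < n_particles / 2:
            idx = rng.choice(n_particles, size=n_particles, p=w)
            models = [list(models[k]) for k in idx]
            log_prior, log_marg = log_prior[idx], log_marg[idx]
            logw[:] = 0.0
    w = np.exp(logw - logw.max())
    w /= w.sum()
    pip = np.zeros(p)
    for i, m in enumerate(models):
        if m:
            pip[m] += w[i]                      # weighted inclusion frequency
    return pip

# Toy example: 15 centered predictors, 3 of which carry signal.
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 15))
X -= X.mean(axis=0)
y = X[:, 0] - 1.5 * X[:, 3] + 2.0 * X[:, 7] + rng.standard_normal(100)
y -= y.mean()
print(np.round(smc_bma(y, X), 2))
```

The toy example at the end prints estimated posterior inclusion probabilities for 15 predictors, three of which carry signal; the resampling step keeps the particle weights from degenerating as the inclusion decisions accumulate.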









Cited In (2)




