Mode jumping MCMC for Bayesian variable selection in GLMM


DOI: 10.1016/J.CSDA.2018.05.020
zbMATH Open: 1469.62082
arXiv: 1604.06398
OpenAlex: W2806191366
MaRDI QID: Q1663135
FDO: Q1663135

Aliaksandr Hubin, Geir Storvik

Publication date: 21 August 2018

Published in: Computational Statistics and Data Analysis

Abstract: Generalized linear mixed models (GLMM) are a powerful scientific tool for inference and prediction in a wide range of applications. An increasing number of data sources are becoming available, introducing many candidate explanatory variables for these models, so selecting an optimal combination of variables is crucial. In a Bayesian setting, the posterior distribution of the models given the observed data serves as a relevant measure of model evidence. However, the number of possible models grows exponentially in the number of candidate variables, and the model space has numerous local extrema in terms of posterior model probabilities. To resolve these issues, a novel MCMC algorithm is introduced that searches through the model space via efficient mode jumping for GLMMs. The algorithm relies on marginal likelihoods being efficiently computable within each model; it is recommended that either exact expressions or precise approximations of the marginal likelihoods be used. The suggested algorithm is applied to simulated data, the famous U.S. crime data, protein activity data and epigenetic data, and is compared to several existing approaches.
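To illustrate the idea behind the abstract, the following is a minimal, hypothetical sketch of mode-jumping MCMC over a binary variable-inclusion space. It is not the authors' implementation: it uses a plain linear model with a BIC-style approximation to the log marginal likelihood, simulated data, and a simplified acceptance step for the mode-jump move (the actual MJMCMC algorithm constructs a valid backward kernel so that the chain remains reversible; that machinery is omitted here).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simulated data: 10 candidate covariates, 3 truly active.
n, p = 100, 10
X = rng.normal(size=(n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + rng.normal(size=n)

def log_marginal(gamma):
    """BIC-style approximation to the log marginal likelihood of the
    linear model containing the covariates flagged in `gamma`."""
    k = int(gamma.sum())
    if k == 0:
        rss = np.sum((y - y.mean()) ** 2)
    else:
        Xg = X[:, gamma.astype(bool)]
        coef = np.linalg.lstsq(Xg, y, rcond=None)[0]
        rss = np.sum((y - Xg @ coef) ** 2)
    return -0.5 * n * np.log(rss / n) - 0.5 * k * np.log(n)

def mh_step(gamma, log_post):
    """Local Metropolis step: flip one random inclusion indicator."""
    prop = gamma.copy()
    prop[rng.integers(p)] ^= 1
    lp = log_marginal(prop)
    if np.log(rng.uniform()) < lp - log_post:
        return prop, lp
    return gamma, log_post

gamma = rng.integers(0, 2, size=p)
lp = log_marginal(gamma)
best, best_lp = gamma.copy(), lp
for it in range(2000):
    if it % 100 == 0:
        # Mode jump: flip several indicators at once, then greedily
        # climb to a nearby local mode before the accept/reject step.
        prop = gamma.copy()
        prop[rng.choice(p, size=3, replace=False)] ^= 1
        improved = True
        while improved:
            improved = False
            for j in range(p):
                cand = prop.copy()
                cand[j] ^= 1
                if log_marginal(cand) > log_marginal(prop):
                    prop, improved = cand, True
        lp_prop = log_marginal(prop)
        # Simplified acceptance; the real algorithm corrects for the
        # asymmetry of the large-jump-plus-optimization proposal.
        if np.log(rng.uniform()) < lp_prop - lp:
            gamma, lp = prop, lp_prop
    else:
        gamma, lp = mh_step(gamma, lp)
    if lp > best_lp:
        best, best_lp = gamma.copy(), lp

print("best model found:", np.flatnonzero(best))
```

The mode-jump move is what lets the chain escape local extrema of the posterior model probabilities: a local flip proposal alone would rarely cross the low-probability valleys between well-separated modes.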


Full work available at URL: https://arxiv.org/abs/1604.06398





Cites Work

Cited In (6)

Uses Software

