Bayesian Bootstrap Spike-and-Slab LASSO
From MaRDI portal
Abstract: The impracticality of posterior sampling has prevented the widespread adoption of spike-and-slab priors in high-dimensional applications. To alleviate the computational burden, optimization strategies have been proposed that quickly find local posterior modes. Trading off uncertainty quantification for computational speed, these strategies have enabled spike-and-slab deployments at scales that would previously have been infeasible. We build on one recent development in this strand of work: the Spike-and-Slab LASSO procedure of Ročková and George (2018). Instead of optimization, however, we explore multiple avenues for posterior sampling, some traditional and some new. Intrigued by the speed of Spike-and-Slab LASSO mode detection, we explore the possibility of sampling from an approximate posterior by performing MAP optimization on many independently perturbed datasets. To this end, we explore Bayesian bootstrap ideas and introduce a new class of jittered Spike-and-Slab LASSO priors with random shrinkage targets. These priors are a key constituent of the Bayesian Bootstrap Spike-and-Slab LASSO (BB-SSL) method proposed here. BB-SSL turns fast optimization into approximate posterior sampling. Beyond its scalability, we show that BB-SSL has strong theoretical support. Indeed, we find that the induced pseudo-posteriors contract around the truth at a near-optimal rate in sparse normal-means and in high-dimensional regression. We compare our algorithm to the traditional Stochastic Search Variable Selection (under Laplace priors) as well as many state-of-the-art methods for shrinkage priors. We show, both in simulations and on real data, that our method fares superbly in these comparisons, often providing substantial computational gains.
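The core recipe the abstract describes — turning a fast MAP optimizer into an approximate posterior sampler by re-solving the optimization under random Bayesian-bootstrap (Dirichlet/exponential) observation weights — can be illustrated with a minimal, self-contained sketch. This is not the authors' BB-SSL implementation: it substitutes a plain Laplace (LASSO) penalty for their jittered Spike-and-Slab LASSO prior, and all function names, the toy data, and the penalty value `lam` are illustrative assumptions.

```python
import random

def soft_threshold(z, t):
    """Soft-thresholding operator: sign(z) * max(|z| - t, 0)."""
    if z > t:
        return z - t
    if z < -t:
        return z + t
    return 0.0

def weighted_lasso(X, y, w, lam, n_iter=200):
    """Coordinate descent for the weighted LASSO (stand-in for the MAP step):
    minimize 0.5 * sum_i w_i (y_i - x_i . beta)^2 + lam * sum_j |beta_j|."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    # Weighted column norms z_j = sum_i w_i x_ij^2.
    z = [sum(w[i] * X[i][j] ** 2 for i in range(n)) for j in range(p)]
    resid = [y[i] - sum(X[i][j] * beta[j] for j in range(p)) for i in range(n)]
    for _ in range(n_iter):
        for j in range(p):
            # rho_j = sum_i w_i x_ij (r_i + x_ij beta_j): correlation with the
            # partial residual that excludes coordinate j's current contribution.
            rho = sum(w[i] * X[i][j] * (resid[i] + X[i][j] * beta[j])
                      for i in range(n))
            new_bj = soft_threshold(rho, lam) / z[j] if z[j] > 0 else 0.0
            delta = new_bj - beta[j]
            if delta != 0.0:
                for i in range(n):
                    resid[i] -= X[i][j] * delta
                beta[j] = new_bj
    return beta

def bayesian_bootstrap_lasso(X, y, lam, n_draws=30, seed=0):
    """Approximate posterior sampling: each draw re-solves the MAP problem
    under i.i.d. Exp(1) observation weights (Dirichlet weights up to scale)."""
    rng = random.Random(seed)
    draws = []
    for _ in range(n_draws):
        w = [rng.expovariate(1.0) for _ in range(len(X))]
        draws.append(weighted_lasso(X, y, w, lam))
    return draws

# Toy sparse regression: only the first coefficient is nonzero.
rng = random.Random(1)
n, p = 60, 5
beta_true = [2.0, 0.0, 0.0, 0.0, 0.0]
X = [[rng.gauss(0, 1) for _ in range(p)] for _ in range(n)]
y = [sum(X[i][j] * beta_true[j] for j in range(p)) + rng.gauss(0, 0.5)
     for i in range(n)]

draws = bayesian_bootstrap_lasso(X, y, lam=5.0)
mean_b0 = sum(d[0] for d in draws) / len(draws)
print(mean_b0)  # concentrates near the true value 2.0, slightly shrunk by lam
```

The spread of the coefficient draws across bootstrap weights is what supplies the uncertainty quantification that a single MAP solve forgoes; BB-SSL additionally jitters the prior's shrinkage targets per draw, which this sketch omits.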
Cites work
- scientific article; zbMATH DE number 3753890
- scientific article; zbMATH DE number 509150
- scientific article; zbMATH DE number 1034042
- A Useful Convergence Theorem for Probability Distributions
- A general theory of concave regularization for high-dimensional sparse estimation problems
- Bayesian Variable Selection in Linear Regression
- Bayesian estimation of sparse signals with a continuous spike-and-slab prior
- Bayesian inference and the parametric bootstrap
- Bayesian lasso regression
- Bayesian model selection in high-dimensional settings
- Dirichlet-Laplace priors for optimal shrinkage
- EMVS: the EM approach to Bayesian variable selection
- Evolutionary stochastic search for Bayesian model exploration
- Exchangeably weighted bootstraps of the general empirical process
- Large deviations of the maximum eigenvalue in Wishart random matrices
- Model Selection and Accounting for Model Uncertainty in Graphical Models Using Occam's Window
- Neuronized Priors for Bayesian Sparse Linear Regression
- Particle EM for variable selection
- Scalable approximate MCMC algorithms for the horseshoe prior
- Scalable variational inference for Bayesian variable selection in regression, and its accuracy in genetic association studies
- Simultaneous Variable and Covariance Selection With the Multivariate Spike-and-Slab LASSO
- Skinny Gibbs: a consistent and scalable Gibbs sampler for model selection
- Spike and Slab Gene Selection for Multigroup Microarray Data
- Spike-and-Slab Group Lassos for Grouped Regression and Sparse Generalized Additive Models
- The Bayesian Lasso
- The horseshoe estimator for sparse signals
- The spike-and-slab LASSO
- Variance prior forms for high-dimensional Bayesian variable selection
- Weighted Bayesian bootstrap for scalable posterior distributions
Cited in (4)
This page was built for publication: Bayesian Bootstrap Spike-and-Slab LASSO (MaRDI item Q127195)