A majorization-minimization approach to variable selection using spike and slab priors
From MaRDI portal
Publication: 638812
DOI: 10.1214/11-AOS884
zbMath: 1220.62065
arXiv: 1005.0891
MaRDI QID: Q638812
Publication date: 14 September 2011
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1005.0891
MSC classification:
- Estimation in multivariate analysis (62H12)
- Linear regression; mixed models (62J05)
- Bayesian inference (62F15)
- Numerical optimization and variational techniques (65K10)
Related Items
- Combining a relaxed EM algorithm with Occam's razor for Bayesian variable selection in high-dimensional regression
- An attention algorithm for solving large scale structured \(l_0\)-norm penalty estimation problems
- Sparsity Constrained Estimation in Image Processing and Computer Vision
- Structured variable selection via prior-induced hierarchical penalty functions
- Neuronized Priors for Bayesian Sparse Linear Regression
- A majorization-minimization approach to variable selection using spike and slab priors
- Bayesian curve fitting and clustering with Dirichlet process mixture models for microarray data
- A robust estimation for the extended \(t\)-process regression model
- MM Algorithms for Variance Components Models
- On the Convergence Rate of Inexact Majorized sGS ADMM with Indefinite Proximal Terms for Convex Composite Programming
- Two-dimensional off-grid DOA estimation with improved three-parallel coprime arrays on moving platform
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- The Adaptive Lasso and Its Oracle Properties
- A majorization-minimization approach to variable selection using spike and slab priors
- Enhancing sparsity by reweighted \(\ell _{1}\) minimization
- One-step sparse estimates in nonconcave penalized likelihood models
- High-dimensional classification using features annealed independence rules
- Relaxed Lasso
- Asymptotics for Lasso-type estimators
- Spike and slab variable selection: frequentist and Bayesian strategies
- Simultaneous analysis of Lasso and Dantzig selector
- On the adaptive elastic net with a diverging number of parameters
- Pathwise coordinate optimization
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\) (with discussions and rejoinder)
- Coordinate descent algorithms for lasso penalized regression
- High-dimensional graphs and variable selection with the Lasso
- Empirical Bayes selection of wavelet thresholds
- The Bayesian elastic net
- Inference with normal-gamma prior distributions in regression problems
- Calibration and empirical Bayes variable selection
- SparseNet: Coordinate Descent With Nonconvex Penalties
- The Group Lasso for Logistic Regression
- Mixtures of g Priors for Bayesian Variable Selection
- The Bayesian Lasso
- Bayesian lasso regression
- Bayesian Variable Selection in Linear Regression
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Flexible Empirical Bayes Estimation for Wavelets
- Regularization and Variable Selection Via the Elastic Net
- On the Non-Negative Garrotte Estimator
- Model Selection and Estimation in Regression with Grouped Variables
- Spike and Slab Gene Selection for Multigroup Microarray Data