Group SLOPE – Adaptive Selection of Groups of Predictors
Publication: Q5229924 (MaRDI item)
Abstract: Sorted L-One Penalized Estimation (SLOPE) is a relatively new convex optimization procedure that allows for adaptive selection of regressors under sparse high-dimensional designs. Here we extend the idea of SLOPE to the situation where one aims to select whole groups of explanatory variables rather than single regressors. Such groups can be formed by clustering strongly correlated predictors or by grouping the dummy variables that correspond to different levels of the same qualitative predictor. We formulate the respective convex optimization problem, gSLOPE (group SLOPE), and propose an efficient algorithm for its solution. We also define a notion of the group false discovery rate (gFDR) and provide a choice of the sequence of tuning parameters for gSLOPE such that gFDR is provably controlled at a prespecified level when the groups of variables are orthogonal to each other. Moreover, we prove that the resulting procedure adapts to unknown sparsity and is asymptotically minimax with respect to the estimation of the proportions of variance of the response variable explained by regressors from different groups. We also provide a method for choosing the regularizing sequence when variables in different groups are not orthogonal but statistically independent, and illustrate its good properties with computer simulations. Finally, we illustrate the advantages of gSLOPE in the context of genome-wide association studies. The R package grpSLOPE, which implements our method, is available on CRAN.
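A minimal usage sketch in R may help make the abstract concrete. It assumes the grpSLOPE() interface of the CRAN package named above (arguments X, y, group, fdr) and a selected component on the fitted object, as documented for the package; the simulated design, signal, and FDR level below are illustrative assumptions, not taken from the paper.

```r
library(grpSLOPE)

set.seed(1)
n <- 100; p <- 100
group <- rep(1:20, each = 5)            # 20 groups of 5 predictors each
X <- matrix(rnorm(n * p), n, p)         # independent Gaussian design
beta <- c(rep(2, 10), rep(0, p - 10))   # true signal in the first two groups only
y <- as.vector(X %*% beta + rnorm(n))

# Fit gSLOPE with the group FDR level prespecified at 10%
fit <- grpSLOPE(X = X, y = y, group = group, fdr = 0.1)

fit$selected   # names of the groups selected by gSLOPE
```

The independent Gaussian design corresponds to the regime discussed in the abstract where groups are not orthogonal but statistically independent, for which the paper provides a choice of the regularizing sequence.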
Cites work
- scientific article, zbMATH DE number 720689 (no title available)
- scientific article, zbMATH DE number 845714 (no title available)
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A new look at the statistical model identification
- Adapting to unknown sparsity by controlling the false discovery rate
- Asymptotic Analysis of Complex LASSO via Complex Approximate Message Passing (CAMP)
- Asymptotic Bayes-optimality under sparsity of some multiple testing procedures
- Block-Sparse Recovery via Convex Optimization
- Bounds on the prediction error of penalized least squares estimators with convex penalty
- Controlling the false discovery rate via knockoffs
- Estimating the dimension of a model
- False discoveries occur early on the Lasso path
- Model Selection and Estimation in Regression with Grouped Variables
- On false discovery rate thresholding for classification under sparsity
- Probing the Pareto frontier for basis pursuit solutions
- SLOPE is adaptive to unknown sparsity and asymptotically minimax
- SLOPE-adaptive variable selection via convex optimization
- Some optimality properties of FDR controlling rules under sparsity
- Sparse Optimization with Least-Squares Constraints
- Standardization and the group lasso penalty
Cited in (7)
- SLOPE-adaptive variable selection via convex optimization
- Adaptive Bayesian SLOPE: Model Selection With Incomplete Data
- Sparse index clones via the sorted \(\ell_1\)-norm
- Low-rank tensor regression for selection of grouped variables
- Safe Rules for the Identification of Zeros in the Solutions of the SLOPE Problem
- On the asymptotic properties of SLOPE
- Characterizing the SLOPE trade-off: a variational perspective and the Donoho-Tanner limit