Group SLOPE – Adaptive Selection of Groups of Predictors
DOI: 10.1080/01621459.2017.1411269
zbMATH Open: 1478.62200
arXiv: 1610.04960
OpenAlex: W2963325939
Wikidata: Q92886270
Scholia: Q92886270
MaRDI QID: Q5229924
FDO: Q5229924
Authors: Damian Brzyski, Alexej Gossmann, Weijie Su, Małgorzata Bogdan
Publication date: 19 August 2019
Published in: Journal of the American Statistical Association
Full work available at URL: https://arxiv.org/abs/1610.04960
Mathematics Subject Classification
- Linear regression; mixed models (62J05)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Paired and multiple comparisons; multiple testing (62J15)
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Sparse Optimization with Least-Squares Constraints
- Probing the Pareto frontier for basis pursuit solutions
- Estimating the dimension of a model
- Title not available
- A new look at the statistical model identification
- Title not available
- Model Selection and Estimation in Regression with Grouped Variables
- Adapting to unknown sparsity by controlling the false discovery rate
- False discoveries occur early on the Lasso path
- Controlling the false discovery rate via knockoffs
- Standardization and the group lasso penalty
- SLOPE-adaptive variable selection via convex optimization
- Asymptotic Bayes-optimality under sparsity of some multiple testing procedures
- SLOPE is adaptive to unknown sparsity and asymptotically minimax
- Block-Sparse Recovery via Convex Optimization
- On false discovery rate thresholding for classification under sparsity
- Some optimality properties of FDR controlling rules under sparsity
- Bounds on the prediction error of penalized least squares estimators with convex penalty
- Asymptotic Analysis of Complex LASSO via Complex Approximate Message Passing (CAMP)
Cited In (7)
- SLOPE-adaptive variable selection via convex optimization
- Sparse index clones via the sorted \(\ell_1\)-norm
- Low-rank tensor regression for selection of grouped variables
- Characterizing the SLOPE trade-off: a variational perspective and the Donoho-Tanner limit
- Safe Rules for the Identification of Zeros in the Solutions of the SLOPE Problem
- On the asymptotic properties of SLOPE
- Adaptive Bayesian SLOPE: Model Selection With Incomplete Data
Uses Software