Safe Rules for the Identification of Zeros in the Solutions of the SLOPE Problem
From MaRDI portal
Publication: 5885837
DOI: 10.1137/21M1457631
OpenAlex: W3209141277
MaRDI QID: Q5885837
Publication date: 30 March 2023
Published in: SIAM Journal on Mathematics of Data Science
Full work available at URL: https://arxiv.org/abs/2110.11784
Mathematics Subject Classification:
- Analysis of algorithms and problem complexity (68Q25)
- Computer graphics; computational geometry (digital and algorithmic aspects) (68U05)
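For context, SLOPE (Sorted L-One Penalized Estimation) is commonly stated as the convex program
\[
\min_{\beta \in \mathbb{R}^p} \; \tfrac{1}{2}\,\|y - X\beta\|_2^2 \;+\; \sum_{i=1}^{p} \lambda_i\, |\beta|_{(i)},
\qquad \lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_p \ge 0,
\]
where \(|\beta|_{(1)} \ge \cdots \ge |\beta|_{(p)}\) are the entries of \(\beta\) sorted by decreasing absolute value. The "safe rules" of the title are tests that certify, at low cost, which coordinates of the solution must be zero, so those coordinates can be discarded before or during optimization. (This is the standard formulation from the SLOPE literature, given here for orientation; it is not quoted from the paper itself.)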
Cites Work
- SLOPE is adaptive to unknown sparsity and asymptotically minimax
- A mathematical introduction to compressive sensing
- SLOPE-adaptive variable selection via convex optimization
- Regularization and the small-ball method. I: Sparse recovery
- Optimization with Sparsity-Inducing Penalties
- Atomic Decomposition by Basis Pursuit
- Gap Safe screening rules for sparsity enforcing penalties
- Safe Feature Elimination in Sparse Supervised Learning
- Screening Rules and its Complexity for Active Set Identification
- Sparse index clones via the sorted ℓ1-Norm
- Safe Squeezing for Antisparse Coding
- Solving the OSCAR and SLOPE Models Using a Semismooth Newton-Based Augmented Lagrangian Method
- Group SLOPE – Adaptive Selection of Groups of Predictors
- Gather and Conquer: Region-Based Strategies to Accelerate Safe Screening Tests
- Simultaneous Regression Shrinkage, Variable Selection, and Supervised Clustering of Predictors with OSCAR
- The Alternating Descent Conditional Gradient Method for Sparse Inverse Problems
- Strong Rules for Discarding Predictors in Lasso-Type Problems
- Convex analysis and monotone operator theory in Hilbert spaces
- Benchmarking optimization software with performance profiles.