Strong Rules for Discarding Predictors in Lasso-Type Problems
Publication:5743136
Abstract: We consider rules for discarding predictors in lasso regression and related problems, with the aim of improving computational efficiency. El Ghaoui et al. (2010) propose "SAFE" rules, based on the inner products of each predictor with the outcome, that guarantee a coefficient will be zero in the solution. In this paper we propose strong rules that are not foolproof but rarely fail in practice. These can be complemented with simple checks of the Karush-Kuhn-Tucker (KKT) conditions to yield rules that are safe and offer substantial speed and space savings in a variety of statistical convex optimization problems.
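As an illustration of the idea described in the abstract, the basic strong rule discards predictor j at penalty level lam when |x_j' y| < 2*lam - lam_max, where lam_max = max_j |x_j' y| is the smallest penalty at which all lasso coefficients are zero. The sketch below is a minimal, hedged rendering of that screening step only (not the paper's full sequential rule or its KKT verification); all variable names are illustrative.

```python
import numpy as np

def basic_strong_rule(X, y, lam):
    """Return a boolean mask of predictors that survive screening.

    Discards predictor j when |x_j' y| < 2*lam - lam_max.  The rule is
    not foolproof, so in practice it is paired with KKT checks on the
    discarded set to confirm no active predictor was dropped.
    """
    scores = np.abs(X.T @ y)            # inner products |x_j' y|
    lam_max = scores.max()              # penalty where all coefs are zero
    return scores >= 2 * lam - lam_max  # keep set (survivors)

# Toy usage: one strong signal among 50 predictors.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 50))
y = 3.0 * X[:, 0] + rng.standard_normal(100)
lam = 0.9 * np.abs(X.T @ y).max()       # penalty just below lam_max
keep = basic_strong_rule(X, y, lam)     # most noise predictors screened out
```

Note that the predictor attaining lam_max always survives whenever lam <= lam_max, since lam_max >= 2*lam - lam_max in that case; the savings come from discarding the remaining predictors before the optimizer ever sees them.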
Recommendations
- Strong consistency of Lasso estimators
- On LASSO for predictive regression
- On the prediction performance of the Lasso
- The predictive Lasso
- Regularizing LASSO: a consistent variable selection method
- Improving Lasso for model selection and prediction
- A note on the Lasso and related procedures in model selection
- Necessary and sufficient conditions for variable selection consistency of the Lasso in high dimensions
- On the sensitivity of the Lasso to the number of predictor variables
- Lasso penalized model selection criteria for high-dimensional multivariate linear regression analysis
Cited in (66)
- An attention algorithm for solving large scale structured \(l_0\)-norm penalty estimation problems
- Scaling up sparse support vector machines by simultaneous feature and sample reduction
- HiQR: an efficient algorithm for high-dimensional quadratic regression with penalties
- Fast best subset selection: coordinate descent and local combinatorial optimization algorithms
- Sequential safe feature elimination rule for \(L_1\)-regularized regression with Kullback-Leibler divergence
- Nonsmoothness in machine learning: specific structure, proximal identification, and applications
- Hybrid safe-strong rules for efficient optimization in Lasso-type problems
- Graphical models for zero-inflated single cell gene expression
- A Pliable Lasso
- scientific article; zbMATH DE number 7626763
- Data shared Lasso: a novel tool to discover uplift
- Two-layer feature reduction for sparse-group Lasso via decomposition of convex sets
- On faster convergence of cyclic block coordinate descent-type methods for strongly convex minimization
- Structured variable selection via prior-induced hierarchical penalty functions
- Estimation of semiparametric regression model with right-censored high-dimensional data
- Adaptive hybrid screening for efficient lasso optimization
- Pathwise coordinate optimization for sparse learning: algorithm and theory
- Computation and analysis of change points with different jump locations in high-dimensional regression
- ROBOUT: a conditional outlier detection methodology for high-dimensional data
- Techniques for accelerating branch-and-bound algorithms dedicated to sparse optimization
- On the distribution, model selection properties and uniqueness of the Lasso estimator in low and high dimensions
- Solving a class of feature selection problems via fractional 0--1 programming
- On stochastic dynamic modeling of incidence data
- Penalized logistic regression with prior information for microarray gene expression classification
- Lasso screening rules via dual polytope projection
- Gap safe screening rules for sparsity enforcing penalties
- Generalized score matching for non-negative data
- Regularization parameter selection for the low rank matrix recovery
- scientific article; zbMATH DE number 7750672
- Locally simultaneous inference
- A dimension reduction technique for large-scale structured sparse optimization problems with application to convex clustering
- scientific article; zbMATH DE number 7370571
- A safe double screening strategy for elastic net support vector machine
- Safe Rules for the Identification of Zeros in the Solutions of the SLOPE Problem
- A coordinate majorization descent algorithm for \(\ell_1\) penalized learning
- A dual based semismooth Newton-type algorithm for solving large-scale sparse Tikhonov regularization problems
- Algorithms for Sparse Support Vector Machines
- The low-volatility anomaly and the adaptive multi-factor model
- A bootstrap model comparison test for identifying genes with context-specific patterns of genetic regulation
- Safe feature screening rules for the regularized Huber regression
- Natural coordinate descent algorithm for \(\ell_1\)-penalised regression in generalised linear models
- Structure estimation of binary graphical models on stratified data: application to the description of injury tables for victims of road accidents
- Best subset, forward stepwise or Lasso? Analysis and recommendations based on extensive comparisons
- Group linear algorithm with sparse principal decomposition: a variable selection and clustering method for generalized linear models
- A Unified Approach to Sparse Tweedie Modeling of Multisource Insurance Claim Data
- Fast stepwise regression based on multidimensional indexes
- Sparse group fused Lasso for model segmentation: a hybrid approach
- \textsf{cmenet}: A new method for bi-level variable selection of conditional main effects
- scientific article; zbMATH DE number 7626751
- Feature screening strategy for non-convex sparse logistic regression with log sum penalty
- Group penalized quantile regression
- Some notes on concordance between optimization and statistics
- Structured iterative hard thresholding with on- and off-grid applications
- Fast and approximate exhaustive variable selection for generalised linear models with APES
- Group subset selection for linear regression
- A decomposition method for Lasso problems with zero-sum constraint
- A fast unified algorithm for solving group-lasso penalized learning problems
- Quantile regression feature selection and estimation with grouped variables using Huber approximation
- PUlasso: High-Dimensional Variable Selection With Presence-Only Data
- Graphical Lasso and thresholding: equivalence and closed-form solutions
- Variable screening for Lasso based on multidimensional indexing
- Lasso regression under stochastic restrictions in linear regression: An application to genomic data
- scientific article; zbMATH DE number 7306914
- A safe reinforced feature screening strategy for Lasso based on feasible solutions
- Large-scale multivariate sparse regression with applications to UK Biobank
- Screening rules and its complexity for active set identification