Strong Rules for Discarding Predictors in Lasso-Type Problems
Publication: 5743136
DOI: 10.1111/j.1467-9868.2011.01004.x
zbMATH: 1411.62213
arXiv: 1011.2234
OpenAlex: W2131060185
Wikidata: Q34674446
Scholia: Q34674446
MaRDI QID: Q5743136
Authors: Jacob Bien, Ryan J. Tibshirani, Jonathan E. Taylor, Robert Tibshirani, Noah Robin Simon, Trevor Hastie, Jerome H. Friedman
Publication date: 9 May 2019
Published in: Journal of the Royal Statistical Society Series B: Statistical Methodology
Full work available at URL: https://arxiv.org/abs/1011.2234
MSC classifications:
- 62J07: Ridge regression; shrinkage estimators (Lasso)
- 65K05: Numerical mathematical programming methods
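The paper's central idea is a cheap screening test: before solving the lasso at penalty level \(\lambda\), discard every predictor whose correlation with the current residual falls below a threshold, then solve the lasso only over the surviving set (checking the KKT conditions afterwards, since the rule is heuristic rather than safe). A minimal sketch of the sequential strong rule for standardized predictors — the function name and interface below are illustrative, not from the paper's software:

```python
import numpy as np

def strong_rule_screen(X, y, lam, lam_prev=None, resid=None):
    """Strong-rule screening for the lasso (sequential form).

    Discards predictor j when |x_j^T r(lam_prev)| < 2*lam - lam_prev,
    where r(lam_prev) is the residual at the previous penalty level.
    Returns a boolean mask of predictors that SURVIVE screening.
    The rule can (rarely) discard an active predictor, so survivors'
    solutions should be verified against the KKT conditions.
    """
    if resid is None:
        # Basic rule: previous solution is the all-zero fit at lam_max,
        # whose residual is y itself.
        resid = y
    if lam_prev is None:
        # lam_max = max_j |x_j^T y|: smallest lambda with all coefficients zero.
        lam_prev = np.max(np.abs(X.T @ y))
    grad = np.abs(X.T @ resid)          # |x_j^T r| for every predictor j
    return grad >= 2 * lam - lam_prev   # keep j iff the test is NOT passed
```

For example, with two orthonormal predictors where only the first is correlated with the response, screening at a penalty just below \(\lambda_{\max}\) keeps the first predictor and drops the second.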
Related Items
- Safe Rules for the Identification of Zeros in the Solutions of the SLOPE Problem
- An attention algorithm for solving large scale structured \(l_0\)-norm penalty estimation problems
- Nonsmoothness in machine learning: specific structure, proximal identification, and applications
- Group subset selection for linear regression
- Hybrid safe-strong rules for efficient optimization in Lasso-type problems
- A Pliable Lasso
- Data shared Lasso: a novel tool to discover uplift
- Natural coordinate descent algorithm for \(\ell_1\)-penalised regression in generalised linear models
- Structured variable selection via prior-induced hierarchical penalty functions
- Safe feature screening rules for the regularized Huber regression
- Large-scale multivariate sparse regression with applications to UK Biobank
- Adaptive hybrid screening for efficient lasso optimization
- Fast stepwise regression based on multidimensional indexes
- On the distribution, model selection properties and uniqueness of the Lasso estimator in low and high dimensions
- Group linear algorithm with sparse principal decomposition: a variable selection and clustering method for generalized linear models
- Estimation of semiparametric regression model with right-censored high-dimensional data
- Screening Rules and its Complexity for Active Set Identification
- A safe double screening strategy for elastic net support vector machine
- Algorithms for Sparse Support Vector Machines
- THE LOW-VOLATILITY ANOMALY AND THE ADAPTIVE MULTI-FACTOR MODEL
- A safe reinforced feature screening strategy for Lasso based on feasible solutions
- Fast and approximate exhaustive variable selection for generalised linear models with APES
- Best subset, forward stepwise or Lasso? Analysis and recommendations based on extensive comparisons
- PUlasso: High-Dimensional Variable Selection With Presence-Only Data
- Solving a class of feature selection problems via fractional 0--1 programming
- Pathwise coordinate optimization for sparse learning: algorithm and theory
- On Faster Convergence of Cyclic Block Coordinate Descent-type Methods for Strongly Convex Minimization
- A dual based semismooth Newton-type algorithm for solving large-scale sparse Tikhonov regularization problems
- A fast unified algorithm for solving group-lasso penalize learning problems
- Regularization parameter selection for the low rank matrix recovery
- Some notes on concordance between optimization and statistics
- Sparse group fused Lasso for model segmentation: a hybrid approach
- A coordinate majorization descent algorithm for ℓ1 penalized learning
- cmenet: A New Method for Bi-Level Variable Selection of Conditional Main Effects
- Graphical models for zero-inflated single cell gene expression
- Gap Safe screening rules for sparsity enforcing penalties
- Structured iterative hard thresholding with on- and off-grid applications
- Fast Best Subset Selection: Coordinate Descent and Local Combinatorial Optimization Algorithms
- Quantile regression feature selection and estimation with grouped variables using Huber approximation
- Group penalized quantile regression
- A decomposition method for Lasso problems with zero-sum constraint
- A Dimension Reduction Technique for Large-Scale Structured Sparse Optimization Problems with Application to Convex Clustering