Selection by partitioning the solution paths
From MaRDI portal
Abstract: The performance of penalized likelihood approaches depends profoundly on the choice of tuning parameter, yet there is no commonly agreed-upon criterion for selecting it. Moreover, penalized likelihood estimation based on a single value of the tuning parameter suffers from several drawbacks. This article introduces a novel approach to feature selection based on the entire solution path rather than on a single tuning parameter, which significantly improves selection accuracy. The approach also allows feature selection with ridge or other strictly convex penalties. The key idea is to classify variables as relevant or irrelevant at each tuning parameter and then to select all variables that have been classified as relevant at least once. We establish the theoretical properties of the method, which requires significantly weaker conditions than existing methods in the literature. We also illustrate the advantages of the proposed approach with simulation studies and a data example.
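The key idea from the abstract — classify each variable as relevant or irrelevant at every tuning parameter along the path, then select the union — can be sketched roughly as follows. This is a simplified illustration only, not the paper's actual classification rule: the ridge penalty, the magnitude threshold, the lambda grid, and the toy data are all assumptions made for the example.

```python
import random

def solve(A, b):
    # Gaussian elimination with partial pivoting for a small linear system.
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def ridge(X, y, lam):
    # Closed-form ridge estimate: solve (X'X + lam*I) beta = X'y.
    p = len(X[0])
    XtX = [[sum(X[i][a] * X[i][b] for i in range(len(X)))
            + (lam if a == b else 0.0) for b in range(p)] for a in range(p)]
    Xty = [sum(X[i][a] * y[i] for i in range(len(X))) for a in range(p)]
    return solve(XtX, Xty)

def select_by_path(X, y, lambdas, threshold=0.5):
    # At each tuning parameter, classify variable j as "relevant" if its
    # coefficient exceeds a magnitude threshold (a stand-in assumption for
    # the paper's classification step), then select the union over the path.
    selected = set()
    for lam in lambdas:
        beta = ridge(X, y, lam)
        selected |= {j for j, b in enumerate(beta) if abs(b) > threshold}
    return sorted(selected)

# Toy data: only the first two of four variables affect the response.
random.seed(0)
n, p = 50, 4
X = [[random.gauss(0, 1) for _ in range(p)] for _ in range(n)]
y = [2.0 * row[0] - 1.5 * row[1] + 0.1 * random.gauss(0, 1) for row in X]

lambdas = [0.01, 0.1, 1.0, 10.0, 100.0]
print(select_by_path(X, y, lambdas))
```

Because the selection is a union over the whole path, no single lambda has to be "right": a variable missed at one tuning parameter can still be picked up at another, which is the intuition the abstract describes.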
Recommendations
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Approximate penalization path for smoothly clipped absolute deviation
- Tuning parameter selection in high dimensional penalized likelihood
- Consistent selection of tuning parameters via variable selection stability
- Coordinate descent algorithms for nonconvex penalized regression, with applications to biological feature selection
Cites work
- scientific article; zbMATH DE number 5957408
- scientific article; zbMATH DE number 3444596
- scientific article; zbMATH DE number 845714
- A Model Selection Approach for the Identification of Quantitative Trait Loci in Experimental Crosses
- A note on the generalized information criterion for choice of a model
- Bayesian Model Averaging for Linear Regression Models
- Bayesian model averaging: A tutorial. (with comments and a rejoinder).
- Consistent selection of tuning parameters via variable selection stability
- Decoding by Linear Programming
- Estimating the dimension of a model
- Estimation in high-dimensional linear models with deterministic design matrices
- Extended Bayesian information criteria for model selection with large model spaces
- High Dimensional Variable Selection via Tilting
- High-dimensional graphs and variable selection with the Lasso
- Likelihood-based selection and sharp parameter estimation
- Model selection and estimation in the Gaussian graphical model
- Model selection in irregular problems: Applications to mapping quantitative trait loci
- Model selection principles in misspecified models
- Nonconcave penalized likelihood with a diverging number of parameters.
- Optimal predictive model selection.
- Random lasso
- Regression coefficient and autoregressive order shrinkage and selection via the lasso
- Regularization parameter selections via generalized information criterion
- Simultaneous analysis of Lasso and Dantzig selector
- Sparse inverse covariance estimation with the graphical lasso
- Sparsity and Smoothness Via the Fused Lasso
- Sparsity oracle inequalities for the Lasso
- Stability Selection
- Statistics for high-dimensional data. Methods, theory and applications.
- Sure independence screening for ultrahigh dimensional feature space. With discussion and authors' reply
- The Adaptive Lasso and Its Oracle Properties
- Tuning parameter selection in high dimensional penalized likelihood
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Variable selection in nonparametric additive models
Cited in (5)
This page was built for publication: Selection by partitioning the solution paths