In defense of LASSO
From MaRDI portal
Publication:5081041
Cites work
- scientific article; zbMATH DE number 5957408 (no title available)
- scientific article; zbMATH DE number 3665891 (no title available)
- scientific article; zbMATH DE number 1347881 (no title available)
- scientific article; zbMATH DE number 845714 (no title available)
- A Selective Overview of Variable Selection in High Dimensional Feature Space (Invited Review Article)
- A general theory of concave regularization for high-dimensional sparse estimation problems
- A limit theorem for the norm of random matrices
- CLT for linear spectral statistics of large-dimensional sample covariance matrices
- Distribution of eigenvalues for some sets of random matrices
- Feature screening via distance correlation learning
- Going beyond oracle property: selection consistency and uniqueness of local solution of the generalized linear model
- High-dimensional graphs and variable selection with the Lasso
- Improved variable selection with forward-lasso adaptive shrinkage
- Measurement Error in Nonlinear Models
- Model selection and estimation in the Gaussian graphical model
- Shrinkage tuning parameter selection with a diverging number of parameters
- Statistical analysis of factor models of high dimension
- Sure independence screening for ultrahigh dimensional feature space. With discussion and authors' reply
- Sure independence screening in generalized linear models with NP-dimensionality
- The Lasso problem and uniqueness
- The smallest eigenvalue of a large dimensional Wishart matrix
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Tuning parameter selection in high dimensional penalized likelihood
- Tuning parameter selectors for the smoothly clipped absolute deviation method