In defense of LASSO
From MaRDI portal
Publication: 5081041
DOI: 10.1080/03610926.2020.1788080
OpenAlex: W3041553044
MaRDI QID: Q5081041
Youngjo Lee, Chi Tim Ng, Woo-Joo Lee
Publication date: 1 June 2022
Published in: Communications in Statistics - Theory and Methods
Full work available at URL: https://doi.org/10.1080/03610926.2020.1788080
Cites Work
- Sure independence screening in generalized linear models with NP-dimensionality
- Statistical analysis of factor models of high dimension
- Improved variable selection with forward-lasso adaptive shrinkage
- Going beyond oracle property: selection consistency and uniqueness of local solution of the generalized linear model
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- The smallest eigenvalue of a large dimensional Wishart matrix
- A limit theorem for the norm of random matrices
- CLT for linear spectral statistics of large-dimensional sample covariance matrices
- The Lasso problem and uniqueness
- High-dimensional graphs and variable selection with the Lasso
- Shrinkage Tuning Parameter Selection with a Diverging Number of Parameters
- A Selective Overview of Variable Selection in High Dimensional Feature Space (Invited Review Article)
- Model selection and estimation in the Gaussian graphical model
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Feature Screening via Distance Correlation Learning
- Tuning parameter selectors for the smoothly clipped absolute deviation method
- Measurement Error in Nonlinear Models
- Distribution of eigenvalues for some sets of random matrices
- Tuning Parameter Selection in High Dimensional Penalized Likelihood
- A general theory of concave regularization for high-dimensional sparse estimation problems