Variable selection and estimation using a continuous approximation to the \(L_0\) penalty
Publication: 1695760
DOI: 10.1007/s10463-016-0588-3 · zbMath: 1385.62019 · OpenAlex: W2535135366 · MaRDI QID: Q1695760
Yanxin Wang, Qibin Fan, L. Zhu
Publication date: 8 February 2018
Published in: Annals of the Institute of Statistical Mathematics
Full work available at URL: https://doi.org/10.1007/s10463-016-0588-3
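For orientation, the sketch below shows one generic form a continuous approximation to the \(L_0\) penalty can take: a SICA-type surrogate \(\lambda\,|\beta|/(|\beta|+\tau)\) (cf. the cited work of Lv and Fan below), which tends to the \(L_0\) indicator \(\lambda\,\mathbf{1}\{\beta\neq 0\}\) as \(\tau\to 0^{+}\). This is an illustrative assumption for readers unfamiliar with such penalties, not necessarily the exact penalty studied in this publication.

```python
import numpy as np

def l0_surrogate(beta, lam=1.0, tau=0.1):
    """Continuous surrogate of the L0 penalty (SICA-type form).

    Computes lam * |b| / (|b| + tau) elementwise. As tau -> 0+,
    this approaches lam * 1{b != 0}, i.e. the L0 indicator.
    Illustrative only; not necessarily the penalty of the paper.
    """
    b = np.abs(np.asarray(beta, dtype=float))
    return lam * b / (b + tau)

# As tau shrinks, nonzero coefficients are penalized nearly equally
# (like L0), while zero stays at zero.
for tau in (1.0, 0.1, 0.01):
    print(tau, l0_surrogate([0.0, 0.5, 2.0], tau=tau))
```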
Cites Work
- Coordinate descent algorithms for nonconvex penalized regression, with applications to biological feature selection
- The group exponential lasso for bi-level variable selection
- Nearly unbiased variable selection under minimax concave penalty
- A unified approach to model selection and sparse recovery using regularized least squares
- The Adaptive Lasso and Its Oracle Properties
- Estimating the dimension of a model
- Heuristics of instability and stabilization in model selection
- Asymptotics for Lasso-type estimators.
- Nonconcave penalized likelihood with a diverging number of parameters.
- The risk inflation criterion for multiple regression
- On the adaptive elastic net with a diverging number of parameters
- Pathwise coordinate optimization
- On the "degrees of freedom" of the lasso
- Coordinate descent algorithms for lasso penalized regression
- Shrinkage Tuning Parameter Selection with a Diverging Number of Parameters
- Better Subset Regression Using the Nonnegative Garrote
- Unified LASSO Estimation by Least Squares Approximation
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- An iterative thresholding algorithm for linear inverse problems with a sparsity constraint
- A Statistical View of Some Chemometrics Regression Tools
- Model Selection via Bayesian Information Criterion for Quantile Regression Models
- Nonconcave Penalized Likelihood With NP-Dimensionality
- Linear Model Selection by Cross-Validation
- Regularization and Variable Selection Via the Elastic Net
- Smoothly Clipped Absolute Deviation on High Dimensions
- Tuning parameter selectors for the smoothly clipped absolute deviation method