Quadratic approximation for nonconvex penalized estimations with a diverging number of parameters
From MaRDI portal
Publication: Q452889
DOI: 10.1016/j.spl.2012.05.012 · zbMATH Open: 1334.62129 · OpenAlex: W2003049564 · MaRDI QID: Q452889
Sunghoon Kwon, Yongdai Kim, Sangin Lee
Publication date: 18 September 2012
Published in: Statistics & Probability Letters
Full work available at URL: https://doi.org/10.1016/j.spl.2012.05.012
Recommendations
- Quadratic approximation on SCAD penalized estimation
- Quadratic approximation via the SCAD penalty with a diverging number of parameters
- Nonconcave penalized likelihood with a diverging number of parameters.
- Global optimality of nonconvex penalized estimators
- Nonconcave penalized M-estimation with a diverging number of parameters
Classification (MSC): Point estimation (62F10) · Linear regression; mixed models (62J05) · Ridge regression; shrinkage estimators (Lasso) (62J07)
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- The Adaptive Lasso and Its Oracle Properties
- Least angle regression. (With discussion)
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Title not available
- One-step sparse estimates in nonconcave penalized likelihood models
- Robust regression: Asymptotics, conjectures and Monte Carlo
- The Concave-Convex Procedure
- A well-conditioned estimator for large-dimensional covariance matrices
- Shrinkage tuning parameter selection with a diverging number of parameters
- Unified LASSO Estimation by Least Squares Approximation
- Nonconcave penalized likelihood with a diverging number of parameters.
- Asymptotic behavior of M-estimators of p regression parameters when \(p^2/n\) is large. I. Consistency
- On the adaptive elastic net with a diverging number of parameters
- Asymptotic properties of bridge estimators in sparse high-dimensional regression models
- Smoothly Clipped Absolute Deviation on High Dimensions
- Asymptotic behavior of likelihood methods for exponential families when the number of parameters tends to infinity
- Profile-kernel likelihood inference with diverging number of parameters
- On parameters of increasing dimensions
- Quadratic approximation on SCAD penalized estimation
- Large sample properties of the SCAD-penalized maximum likelihood estimation on high dimensions
- Global optimality of nonconvex penalized estimators
- Least squares approximation with a diverging number of parameters
Cited In (4)