Quadratic approximation for nonconvex penalized estimations with a diverging number of parameters
From MaRDI portal
Publication:452889
DOI: 10.1016/j.spl.2012.05.012
zbMath: 1334.62129
MaRDI QID: Q452889
Sunghoon Kwon, Sangin Lee, Yongdai Kim
Publication date: 18 September 2012
Published in: Statistics & Probability Letters
Full work available at URL: https://doi.org/10.1016/j.spl.2012.05.012
Keywords: quadratic approximation; nonconvex penalty; oracle property; SCAD; smoothly clipped absolute deviation
62J07: Ridge regression; shrinkage estimators (Lasso)
62J05: Linear regression; mixed models
62F10: Point estimation
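The keywords above reference the SCAD (smoothly clipped absolute deviation) penalty and its quadratic approximation. As a hedged illustration only (not code from the paper), the sketch below defines the standard SCAD penalty of Fan and Li, with the conventional tuning value a = 3.7, and the classical local quadratic approximation (LQA) of a nonconvex penalty around a nonzero point; all function names are my own.

```python
def scad_penalty(t, lam, a=3.7):
    """SCAD penalty p_lam(|t|); a = 3.7 is the conventional choice."""
    t = abs(t)
    if t <= lam:
        return lam * t
    elif t <= a * lam:
        return (2 * a * lam * t - t ** 2 - lam ** 2) / (2 * (a - 1))
    else:
        return lam ** 2 * (a + 1) / 2

def scad_deriv(t, lam, a=3.7):
    """First derivative p'_lam(t) for t >= 0."""
    t = abs(t)
    if t <= lam:
        return lam
    elif t <= a * lam:
        return (a * lam - t) / (a - 1)
    else:
        return 0.0

def lqa(beta, beta0, lam, a=3.7):
    """Local quadratic approximation around beta0 != 0:
    p(|b|) ~ p(|b0|) + 0.5 * p'(|b0|) / |b0| * (b**2 - b0**2),
    which replaces the nonconvex penalty by a quadratic that agrees
    with it (in value) at the expansion point beta0."""
    b0 = abs(beta0)
    return (scad_penalty(b0, lam, a)
            + 0.5 * scad_deriv(b0, lam, a) / b0 * (beta ** 2 - beta0 ** 2))
```

Because the approximation is quadratic in the coefficients, each update of a penalized regression reduces to a ridge-type least-squares step; this is the general device behind quadratic-approximation algorithms for nonconvex penalized estimation.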
Related Items
- Sparse optimization for nonconvex group penalized estimation
- Combined-penalized likelihood estimations with a diverging number of parameters
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- The Adaptive Lasso and Its Oracle Properties
- A well-conditioned estimator for large-dimensional covariance matrices
- Quadratic approximation on SCAD penalized estimation
- Least squares approximation with a diverging number of parameters
- One-step sparse estimates in nonconcave penalized likelihood models
- Profile-kernel likelihood inference with diverging number of parameters
- Asymptotic behavior of M-estimators of p regression parameters when \(p^2/n\) is large. I. Consistency
- Asymptotic behavior of likelihood methods for exponential families when the number of parameters tends to infinity
- On parameters of increasing dimensions
- Robust regression: Asymptotics, conjectures and Monte Carlo
- Nonconcave penalized likelihood with a diverging number of parameters.
- Least angle regression. (With discussion)
- On the adaptive elastic net with a diverging number of parameters
- Asymptotic properties of bridge estimators in sparse high-dimensional regression models
- Large sample properties of the smoothly clipped absolute deviation penalized maximum likelihood estimation on high dimensions
- Global optimality of nonconvex penalized estimators
- Shrinkage Tuning Parameter Selection with a Diverging number of Parameters
- Unified LASSO Estimation by Least Squares Approximation
- The Concave-Convex Procedure
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Smoothly Clipped Absolute Deviation on High Dimensions