Going beyond oracle property: selection consistency and uniqueness of local solution of the generalized linear model
From MaRDI portal
Publication:670138
DOI: 10.1016/j.stamet.2016.05.006 · zbMath: 1487.62089 · OpenAlex: W2413683033 · MaRDI QID: Q670138
Chi Tim Ng, Seungyoung Oh, Youngjo Lee
Publication date: 18 March 2019
Published in: Statistical Methodology
Full work available at URL: https://doi.org/10.1016/j.stamet.2016.05.006
generalized linear model; selection consistency; oracle property; SCAD penalty; penalized likelihood estimation
Asymptotic properties of parametric estimators (62F12) ⋮ Ridge regression; shrinkage estimators (Lasso) (62J07) ⋮ Generalized linear models (logistic models) (62J12)
Related Items
Variable selection under multicollinearity using modified log penalty ⋮ In defense of LASSO ⋮ Removing the singularity of a penalty via thresholding function matching ⋮ Properties of h‐Likelihood Estimators in Clustered Data ⋮ Hypothesis testing via a penalized-likelihood approach
Cites Work
- Sure independence screening in generalized linear models with NP-dimensionality
- Sparse canonical covariance analysis for high-throughput data
- High-dimensional regression with noisy and missing data: provable guarantees with nonconvexity
- One-step sparse estimates in nonconcave penalized likelihood models
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- The use of random-effect models for high-dimensional variable selection problems
- Calibrating nonconvex penalized regression in ultra-high dimension
- Strong oracle optimality of folded concave penalized estimation
- A new sparse variable selection via random-effect model
- Global optimality of nonconvex penalized estimators
- A Selective Overview of Variable Selection in High Dimensional Feature Space (Invited Review Article)
- The Concave-Convex Procedure
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Comparison of Discrimination Methods for the Classification of Tumors Using Gene Expression Data
- Stability Selection
- Nonconcave Penalized Likelihood With NP-Dimensionality
- Regularization and Variable Selection Via the Elastic Net
- Regularized M-estimators with nonconvexity: Statistical and algorithmic theory for local optima
- The elements of statistical learning. Data mining, inference, and prediction
- A general theory of concave regularization for high-dimensional sparse estimation problems