Going beyond oracle property: selection consistency and uniqueness of local solution of the generalized linear model
DOI: 10.1016/j.stamet.2016.05.006
zbMATH Open: 1487.62089
OpenAlex: W2413683033
MaRDI QID: Q670138
Authors: Chi Tim Ng, Seungyoung Oh, Youngjo Lee
Publication date: 18 March 2019
Published in: Statistical Methodology
Full work available at URL: https://doi.org/10.1016/j.stamet.2016.05.006
Recommendations
- General oracle inequalities for model selection
- The consistency of variable selection for generalized linear models
- The generalized Lasso problem and uniqueness
- Bayesian model selection for generalized linear models using non-local priors
- Model Selection and Minimax Estimation in Generalized Linear Models
- Strong consistency of the maximum likelihood estimator in generalized linear models
- On the distribution, model selection properties and uniqueness of the Lasso estimator in low and high dimensions
- Oracle estimation of parametric models under boundary constraints
Keywords: generalized linear model; penalized likelihood estimation; selection consistency; oracle property; SCAD penalty
MSC classifications:
- Asymptotic properties of parametric estimators (62F12)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Generalized linear models (logistic models) (62J12)
Cites Work
- Sure independence screening in generalized linear models with NP-dimensionality
- The elements of statistical learning. Data mining, inference, and prediction
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Sparse canonical covariance analysis for high-throughput data
- Stability Selection
- One-step sparse estimates in nonconcave penalized likelihood models
- The Concave-Convex Procedure
- Comparison of Discrimination Methods for the Classification of Tumors Using Gene Expression Data
- Regularization and Variable Selection Via the Elastic Net
- High-dimensional regression with noisy and missing data: provable guarantees with nonconvexity
- A Selective Overview of Variable Selection in High Dimensional Feature Space (Invited Review Article)
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- A general theory of concave regularization for high-dimensional sparse estimation problems
- Calibrating nonconvex penalized regression in ultra-high dimension
- Strong oracle optimality of folded concave penalized estimation
- Nonconcave Penalized Likelihood With NP-Dimensionality
- Regularized \(M\)-estimators with nonconvexity: statistical and algorithmic theory for local optima
- Global optimality of nonconvex penalized estimators
- The use of random-effect models for high-dimensional variable selection problems
- A new sparse variable selection via random-effect model
Cited In (11)
- On Hodges' superefficiency and merits of oracle property in model selection
- Hypothesis testing via a penalized-likelihood approach
- On the strong oracle property of concave penalized estimators with infinite penalty derivative at the origin
- Model selection consistency of \(U\)-statistics with convex loss and weighted Lasso penalty
- Properties of h‐Likelihood Estimators in Clustered Data
- The asymptotic properties of SCAD penalized generalized linear models with adaptive designs
- Removing the singularity of a penalty via thresholding function matching
- Variable selection under multicollinearity using modified log penalty
- A necessary condition for the strong oracle property
- In defense of LASSO
- Global optimality of nonconvex penalized estimators