A scalable surrogate L₀ sparse regression method for generalized linear models with applications to large scale data
DOI: 10.1016/j.jspi.2020.12.001 · zbMATH Open: 1465.62132 · OpenAlex: W3112516299 · MaRDI QID: Q830734 · FDO: Q830734
Authors: Xiaoling Peng, Eric Kawaguchi, Marc A. Suchard, Gang Li, Ning Li
Publication date: 7 May 2021
Published in: Journal of Statistical Planning and Inference
Full work available at URL: https://doi.org/10.1016/j.jspi.2020.12.001
scientific article; zbMATH DE number 6982301
Recommendations
- Broken adaptive ridge regression and its asymptotic properties
- An attention algorithm for solving large scale structured \(l_0\)-norm penalty estimation problems
- Variable selection and estimation in generalized linear models with the seamless \(L_0\) penalty
- Adaptive Lasso for sparse high-dimensional regression models
Keywords: variable selection; ridge regression; generalized linear models; \(L_0\) penalty; high-dimensional massive sample size data
MSC classification:
- 62P10 Applications of statistics to biology and medical sciences; meta analysis
- 62J07 Ridge regression; shrinkage estimators (Lasso)
- 62J12 Generalized linear models (logistic models)
Cites Work
- Sure independence screening in generalized linear models with NP-dimensionality
- Model-free feature screening for ultrahigh-dimensional data
- Estimating the dimension of a model
- Heuristics of instability and stabilization in model selection
- Coordinate descent algorithms for lasso penalized regression
- Extended Bayesian information criteria for model selection with large model spaces
- Some comments on \(C_p\)
- A new look at the statistical model identification
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Feature screening via distance correlation learning
- A significance test for the lasso
- Exact post-selection inference, with application to the Lasso
- The central role of the propensity score in observational studies for causal effects
- Feature selection for varying coefficient models with ultrahigh-dimensional covariates
- Quantile-adaptive model-free variable screening for high-dimensional heterogeneous data
- Nonparametric independence screening in sparse ultra-high-dimensional additive models
- Nonconcave penalized likelihood with a diverging number of parameters
- The risk inflation criterion for multiple regression
- Likelihood-based selection and sharp parameter estimation
- Ultrahigh dimensional feature selection: beyond the linear model
- Strong oracle optimality of folded concave penalized estimation
- Forward regression for ultra-high dimensional variable screening
- Robust rank correlation based screening
- Title not available
- Regularized quantile regression and robust feature screening for single index models
- Model-free feature screening for ultrahigh dimensional data with responses missing at random
- Adaptive conditional feature screening
- Broken adaptive ridge regression and its asymptotic properties
- Efficient regularized regression with \(L_0\) penalty for variable selection and network construction
- Massive parallelization of serial inference algorithms for a complex generalized linear model
- The sparse MLE for ultrahigh-dimensional feature screening
Cited In (3)