GSDAR: a fast Newton algorithm for \(\ell_0\) regularized generalized linear models with statistical guarantee
From MaRDI portal
Publication:2135875
DOI: 10.1007/s00180-021-01098-z · zbMath: 1505.62193 · OpenAlex: W3146025976 · MaRDI QID: Q2135875
Yan Yan Liu, Jian Huang, Xiliang Lu, Lican Kang, Jin Liu, Yu Ling Jiao
Publication date: 10 May 2022
Published in: Computational Statistics
Full work available at URL: https://doi.org/10.1007/s00180-021-01098-z
Keywords: sparse learning; estimation error; \(\ell_0\)-penalty; support detection; high-dimensional generalized linear models
Classifications: Computational methods for problems pertaining to statistics (62-08); Generalized linear models (logistic models) (62J12)
Cites Work
- Coordinate descent algorithms for nonconvex penalized regression, with applications to biological feature selection
- Nearly unbiased variable selection under minimax concave penalty
- Gradient methods for minimizing composite functions
- Optimal computational and statistical rates of convergence for sparse nonconvex learning problems
- Generalized linear models
- Estimating the dimension of a model
- A unified primal dual active set algorithm for nonconvex sparse recovery
- Transformed \(\ell_1\) regularization for learning sparse deep neural networks
- Modified versions of the Bayesian information criterion for sparse generalized linear models
- A nonsmooth version of Newton's method
- High-dimensional generalized linear models and the lasso
- Complexity of unconstrained \(L_2 - L_p\) minimization
- Calibrating nonconvex penalized regression in ultra-high dimension
- Adapting to unknown sparsity by controlling the false discovery rate
- Phenotypes and genotypes. The search for influential genes
- Extended BIC for small-n-large-P sparse GLM
- Extended Bayesian information criteria for model selection with large model spaces
- The Group Lasso for Logistic Regression
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Smoothing Methods and Semismooth Methods for Nondifferentiable Operator Equations
- Group Sparse Recovery via the $\ell ^0(\ell ^2)$ Penalty: Theory and Algorithm
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Convergence Analysis of Some Algorithms for Solving Nonsmooth Equations
- Sparse Approximate Solutions to Linear Systems
- Sequential Lasso Cum EBIC for Feature Selection With Ultra-High Dimensional Feature Space
- L1-Regularization Path Algorithm for Generalized Linear Models
- Regularization and Variable Selection Via the Elastic Net
- Regularized M-estimators with nonconvexity: Statistical and algorithmic theory for local optima
- Gaussian model selection
- Group descent algorithms for nonconvex penalized linear and logistic regression models with grouped predictors
- A general theory of concave regularization for high-dimensional sparse estimation problems