Variable selection for sparse logistic regression
DOI: 10.1007/s00184-020-00764-4
zbMATH: 1450.62089
OpenAlex: W3005100705
MaRDI QID: Q2202033
Publication date: 17 September 2020
Published in: Metrika
Full work available at URL: https://doi.org/10.1007/s00184-020-00764-4
MSC classification:
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Generalized linear models (logistic models) (62J12)
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- Statistics for high-dimensional data. Methods, theory and applications.
- Berry-Esseen type estimates for large deviation probabilities
- Honest variable selection in linear and logistic regression models via \(\ell_1\) and \(\ell_1+\ell_2\) penalization
- Self-concordant analysis for logistic regression
- Simultaneous analysis of Lasso and Dantzig selector
- High-dimensional generalized linear models and the lasso
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Non-asymptotic oracle inequalities for the Lasso and Group Lasso in high dimensional logistic model
- Oracle Inequalities for a Group Lasso Procedure Applied to Generalized Linear Models in High Dimension
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- The Group Lasso for Logistic Regression
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Consistent Functional Methods for Logistic Regression With Errors in Covariates
- Regularization and Variable Selection Via the Elastic Net
- Model Selection and Estimation in Regression with Grouped Variables
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers