A primal and dual active set algorithm for truncated \(L_1\) regularized logistic regression
From MaRDI portal
Publication:2691265
DOI: 10.3934/jimo.2022050 · OpenAlex: W4226520567 · MaRDI QID: Q2691265
Yuan Luo, Lican Kang, Chang Zhu, Jerry Zhijian Yang
Publication date: 29 March 2023
Published in: Journal of Industrial and Management Optimization
Full work available at URL: https://doi.org/10.3934/jimo.2022050
MSC classifications:
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Generalized linear models (logistic models) (62J12)
- Nonconvex programming, global optimization (90C26)
Cites Work
- Coordinate descent algorithms for nonconvex penalized regression, with applications to biological feature selection
- Gradient methods for minimizing composite functions
- Optimal computational and statistical rates of convergence for sparse nonconvex learning problems
- Logistic regression. A self-learning text. With contributions by Erica Rihl Pryor.
- A unified primal dual active set algorithm for nonconvex sparse recovery
- GSDAR: a fast Newton algorithm for \(\ell_0\) regularized generalized linear models with statistical guarantee
- High-dimensional generalized linear models and the lasso
- Calibrating nonconvex penalized regression in ultra-high dimension
- Variational Analysis
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Applied Logistic Regression
- Sequential Lasso Cum EBIC for Feature Selection With Ultra-High Dimensional Feature Space
- L 1-Regularization Path Algorithm for Generalized Linear Models
- Comments on «Wavelets in statistics: A review» by A. Antoniadis
- Regularization and Variable Selection Via the Elastic Net
- Regularized M-estimators with nonconvexity: Statistical and algorithmic theory for local optima