Subspace quadratic regularization method for group sparse multinomial logistic regression
DOI: 10.1007/s10589-021-00287-2 · zbMATH Open: 1472.62122 · OpenAlex: W3169471969 · MaRDI QID: Q2044487
Authors: Rui Wang, Kim-Chuan Toh, Naihua Xiu
Publication date: 9 August 2021
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1007/s10589-021-00287-2
Recommendations
- Sparse group Lasso and high dimensional multinomial classification
- An extended Newton-type algorithm for \(\ell_2\)-regularized sparse logistic regression and its efficiency for classifying large-scale datasets
- A Subspace Acceleration Method for Minimization Involving a Group Sparsity-Inducing Regularizer
- Alternating direction method of multipliers for \(\ell_{1}\)-\(\ell_{2}\)-regularized logistic regression model
- An improved OWL-QN method for solving sparse logistic regression problems
Keywords: global convergence; numerical experiment; locally quadratic convergence; quadratic regularization method; sparse multinomial logistic regression
MSC: Computational methods for problems pertaining to statistics (62-08); Generalized linear models (logistic models) (62J12)
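The record's title and keywords point to group sparse multinomial logistic regression, i.e., a multinomial logistic loss combined with a group-sparsity (row-wise \(\ell_{2,0}\)) term that selects whole features across all classes at once. A minimal illustrative sketch of that objective (function names and the rows-as-groups convention are assumptions for illustration, not the paper's code):

```python
import numpy as np

def multinomial_logistic_loss(W, X, y):
    """Average negative log-likelihood of multinomial logistic regression.

    W : (p, q) weight matrix, one column per class.
    X : (n, p) feature matrix.
    y : (n,) integer class labels in {0, ..., q-1}.
    """
    scores = X @ W                                   # (n, q) class scores
    scores = scores - scores.max(axis=1, keepdims=True)  # numerical stability
    log_probs = scores - np.log(np.exp(scores).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(y)), y].mean()

def group_sparsity(W, tol=1e-8):
    """l_{2,0} group sparsity: number of rows of W with nonzero l2 norm.

    Each row groups the coefficients of one feature across all classes,
    so a zero row means that feature is dropped for every class.
    """
    return int(np.sum(np.linalg.norm(W, axis=1) > tol))
```

The subspace idea in methods of this kind is to restrict each (regularized Newton) step to the currently active groups, which keeps the per-iteration subproblem small when the solution is group sparse.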
Cites Work
- Computing a Trust Region Step
- Numerical Optimization
- Multinomial logistic regression algorithm
- Title not available
- Variational Analysis
- Regression modeling strategies. With applications to linear models, logistic regression, and survival analysis
- Sparse group Lasso and high dimensional multinomial classification
- Sparse Approximate Solutions to Linear Systems
- Greedy sparsity-constrained optimization
- A QP-free constrained Newton-type method for variational inequality problems
- Block coordinate descent algorithms for large-scale sparse multiclass classification
- Sparse Multinomial Logistic Regression via Approximate Message Passing
- Variable selection in general multinomial logit models
- Proximal Newton-type methods for minimizing composite functions
- Majorization-Minimization Algorithms in Signal Processing, Communications, and Machine Learning
- Statistical Models
- Group descent algorithms for nonconvex penalized linear and logistic regression models with grouped predictors
- Optimization problems involving group sparsity terms
- An extended Newton-type algorithm for \(\ell_2\)-regularized sparse logistic regression and its efficiency for classifying large-scale datasets
Cited In (9)
- Sparse group Lasso and high dimensional multinomial classification
- A splitting augmented Lagrangian method embedding in the BB method for solving the sparse logistic problem
- Iteratively reweighted group Lasso based on log-composite regularization
- An extended Newton-type algorithm for \(\ell_2\)-regularized sparse logistic regression and its efficiency for classifying large-scale datasets
- Optimality conditions for Tucker low-rank tensor optimization
- \(\ell_{2,0}\)-norm based selection and estimation for multivariate generalized linear models
- Relaxation quadratic approximation greedy pursuit method based on sparse learning
- An improved OWL-QN method for solving sparse logistic regression problems
- A Subspace Acceleration Method for Minimization Involving a Group Sparsity-Inducing Regularizer