Smoothing proximal gradient method for general structured sparse regression
From MaRDI portal
Abstract: We study the problem of estimating high-dimensional regression models regularized by a structured sparsity-inducing penalty that encodes prior structural information on either the input or output variables. We consider two widely adopted types of penalties of this kind as motivating examples: (1) the general overlapping-group-lasso penalty, generalized from the group-lasso penalty; and (2) the graph-guided-fused-lasso penalty, generalized from the fused-lasso penalty. For both types of penalties, their nonseparability and nonsmoothness make developing an efficient optimization method a challenging problem. In this paper we propose a general optimization approach, the smoothing proximal gradient (SPG) method, which can solve structured sparse regression problems with any smooth convex loss under a wide spectrum of structured sparsity-inducing penalties. Our approach combines a smoothing technique with an effective proximal gradient method. It achieves a convergence rate significantly faster than standard first-order methods such as subgradient methods, and is much more scalable than the widely used interior-point methods. The efficiency and scalability of our method are demonstrated on both simulation experiments and real genetic data sets.
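The idea described in the abstract can be illustrated with a minimal sketch: replace the nonsmooth structured penalty (here a graph/fused penalty written as the l1 norm of C·beta) by its Nesterov-smoothed surrogate, whose gradient is available in closed form, then run an accelerated proximal gradient (FISTA-style) loop in which only the separable l1 term is handled by its prox (soft-thresholding). The function name `spg` and all parameter choices below are illustrative assumptions, not the authors' code.

```python
import numpy as np

def spg(X, y, C, lam, gamma, mu=1e-3, n_iter=500):
    """Sketch of a smoothing proximal gradient method (illustrative only).

    Objective: (1/2n)||y - X beta||^2 + gamma * ||C beta||_1 + lam * ||beta||_1,
    with the ||C beta||_1 term replaced by its Nesterov-smoothed surrogate
    (smoothing parameter mu).
    """
    n, p = X.shape
    beta = np.zeros(p)
    w = beta.copy()
    t = 1.0
    # Lipschitz constant of the smooth part: spectral norm of the loss
    # Hessian plus gamma * ||C||_2^2 / mu from the smoothed penalty.
    L = np.linalg.norm(X, 2) ** 2 / n + gamma * np.linalg.norm(C, 2) ** 2 / mu
    for _ in range(n_iter):
        # Gradient of the smoothed penalty: C^T alpha*, where
        # alpha* = clip(C w / mu, -1, 1) solves the inner maximization.
        alpha = np.clip(C @ w / mu, -1.0, 1.0)
        grad = X.T @ (X @ w - y) / n + gamma * (C.T @ alpha)
        # Proximal (soft-thresholding) step for the separable l1 term.
        z = w - grad / L
        beta_new = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
        # FISTA momentum update.
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        w = beta_new + (t - 1.0) / t_new * (beta_new - beta)
        beta, t = beta_new, t_new
    return beta
```

Because the smoothed surrogate is differentiable with a known Lipschitz constant, the accelerated gradient scheme retains its O(1/k^2) rate on the smoothed problem; choosing mu on the order of the target accuracy recovers the overall rate claimed in the abstract.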
Cites work
- Scientific article, zbMATH DE number 1818892 (title unavailable)
- Scientific article, zbMATH DE number 2107186 (title unavailable)
- Scientific article, zbMATH DE number 845714 (title unavailable)
- Scientific article, zbMATH DE number 6253954 (title unavailable)
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A feasible semismooth asymptotically Newton method for mixed complementarity problems
- A nonsmooth version of Newton's method
- A quasi-Newton acceleration for high-dimensional optimization algorithms
- Alternating Projection-Proximal Methods for Convex Programming and Variational Inequalities
- Coordinate descent algorithms for lasso penalized regression
- Efficient online and batch learning using forward backward splitting
- Equivalent Subgradient Versions of Hamiltonian and Euler–Lagrange Equations in Variational Analysis
- Model Selection and Estimation in Regression with Grouped Variables
- Pathwise coordinate optimization
- Primal-dual first-order methods with \({\mathcal {O}(1/\varepsilon)}\) iteration-complexity for cone programming
- Proximal methods for hierarchical sparse coding
- Reconstructing DNA copy number by penalized estimation and imputation
- Smooth minimization of non-smooth functions
- Solving semidefinite-quadratic-linear programs using SDPT3
- Sparsity and Smoothness Via the Fused Lasso
- Structured variable selection with sparsity-inducing norms
- The composite absolute penalties family for grouped and hierarchical variable selection
- The solution path of the generalized lasso
Cited in (31 documents)
- Efficient inexact proximal gradient algorithms for structured sparsity-inducing norm
- Structured sparsity via alternating direction methods
- Reducing the Complexity of Two Classes of Optimization Problems by Inexact Accelerated Proximal Gradient Method
- Locally Sparse Function-on-Function Regression
- An Efficient Algorithm for Minimizing Multi Non-Smooth Component Functions
- Group projected subspace pursuit for block sparse signal reconstruction: convergence analysis and applications
- Network classification with applications to brain connectomics
- Structured sparsity promoting functions
- Sparse group fused Lasso for model segmentation: a hybrid approach
- Selective linearization for multi-block statistical learning
- Truncated estimation in functional generalized linear regression models
- It's All Relative: Regression Analysis with Compositional Predictors
- Regularized regression when covariates are linked on a network: the 3CoSE algorithm
- Tree-guided group lasso for multi-response regression with structured sparsity, with an application to eQTL mapping
- Point process estimation with Mirror Prox algorithms
- Group sparse structural smoothing recovery: model, statistical properties and algorithm
- A fast and efficient smoothing approach to Lasso regression and an application in statistical genetics: polygenic risk scores for chronic obstructive pulmonary disease (COPD)
- A smoothing stochastic gradient method for composite optimization
- Structured sparsity through convex optimization
- Penalized Estimation of Frailty-Based Illness–Death Models for Semi-Competing Risks
- Sparse regression with multi-type regularized feature modeling
- Heterogeneous Mediation Analysis on Epigenomic PTSD and Traumatic Stress in a Predominantly African American Cohort
- A graph decomposition-based approach for the graph-fused Lasso
- Factorisable multitask quantile regression
- PAC-Bayesian risk bounds for group-analysis sparse regression by exponential weighting
- Tree-Guided Rare Feature Selection and Logic Aggregation with Electronic Health Records Data
- Easily Parallelizable and Distributable Class of Algorithms for Structured Sparsity, with Optimal Acceleration
- Smoothing composite proximal gradient algorithm for sparse group Lasso problems with nonsmooth loss functions
- A Joint Fairness Model with Applications to Risk Predictions for Underrepresented Populations
- Proximal methods for the latent group lasso penalty
- Regularization-based model tree for multi-output regression