Smoothing proximal gradient method for general structured sparse regression
Publication: 439167
DOI: 10.1214/11-AOAS514
zbMATH Open: 1243.62100
arXiv: 1005.4717
OpenAlex: W1989060270
MaRDI QID: Q439167
Authors: Xi Chen, Qihang Lin, Seyoung Kim, Jaime G. Carbonell, Eric P. Xing
Publication date: 1 August 2012
Published in: The Annals of Applied Statistics
Abstract: We study the problem of estimating high-dimensional regression models regularized by a structured sparsity-inducing penalty that encodes prior structural information on either the input or output variables. We consider two widely adopted types of such penalties as motivating examples: (1) the general overlapping-group-lasso penalty, generalized from the group-lasso penalty; and (2) the graph-guided-fused-lasso penalty, generalized from the fused-lasso penalty. For both types of penalties, their nonseparability and nonsmoothness make developing an efficient optimization method a challenging problem. In this paper we propose a general optimization approach, the smoothing proximal gradient (SPG) method, which can solve structured sparse regression problems with any smooth convex loss under a wide spectrum of structured sparsity-inducing penalties. Our approach combines a smoothing technique with an effective proximal gradient method. It achieves a convergence rate significantly faster than that of the standard first-order approach, the subgradient method, and is much more scalable than the most widely used interior-point methods. The efficiency and scalability of our method are demonstrated on both simulation experiments and real genetic data sets.
Full work available at URL: https://arxiv.org/abs/1005.4717
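To make the abstract's description concrete: the SPG method replaces the nonsmooth structured penalty with a smooth surrogate via its dual-norm (Nesterov-style) smoothing, then runs an accelerated proximal gradient loop on the smoothed objective. Below is a minimal sketch for the overlapping-group-lasso penalty with a squared-error loss. The function name, the step-size bound, and the FISTA-style momentum update are illustrative assumptions made for this sketch, not the authors' reference implementation.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (elementwise soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def spg_overlapping_group_lasso(X, y, groups, lam_group, lam_l1,
                                mu=1e-3, n_iter=500):
    """Sketch of SPG for
        min_b 0.5*||y - X b||^2 + lam_group * sum_g ||b_g||_2 + lam_l1 * ||b||_1
    with possibly overlapping groups; `groups` is a list of index arrays.
    The group penalty is smoothed with parameter mu; the l1 term is kept
    exact and handled by its prox (soft thresholding)."""
    n, p = X.shape
    b = np.zeros(p)
    w = b.copy()              # extrapolated (momentum) point
    theta = 1.0

    # Lipschitz constant of the smoothed smooth part:
    # lambda_max(X^T X) + lam_group^2 * ||C||^2 / mu, where for this penalty
    # ||C||^2 is at most the max number of groups any coordinate belongs to.
    overlap = np.zeros(p)
    for g in groups:
        overlap[g] += 1
    L = np.linalg.norm(X, 2) ** 2 + lam_group ** 2 * overlap.max() / mu

    for _ in range(n_iter):
        # Gradient of the smoothed group penalty: C^T alpha*, where each
        # alpha*_g projects lam_group * w_g / mu onto the unit Euclidean ball.
        grad_pen = np.zeros(p)
        for g in groups:
            a = lam_group * w[g] / mu
            a /= max(1.0, np.linalg.norm(a))
            grad_pen[g] += lam_group * a

        grad = X.T @ (X @ w - y) + grad_pen
        b_next = soft_threshold(w - grad / L, lam_l1 / L)

        # FISTA-style acceleration.
        theta_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * theta ** 2))
        w = b_next + ((theta - 1.0) / theta_next) * (b_next - b)
        b, theta = b_next, theta_next
    return b
```

In the paper, the smoothing parameter mu is tied to the target accuracy eps, which yields an O(1/eps) iteration complexity, compared with O(1/eps^2) for the subgradient method; in a sketch like the one above, a smaller mu gives a more faithful penalty at the cost of a larger Lipschitz constant and smaller steps.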
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Solving semidefinite-quadratic-linear programs using SDPT3
- Pathwise coordinate optimization
- Coordinate descent algorithms for lasso penalized regression
- Title not available
- Sparsity and Smoothness Via the Fused Lasso
- Model Selection and Estimation in Regression with Grouped Variables
- The solution path of the generalized lasso
- Smooth minimization of non-smooth functions
- Proximal methods for hierarchical sparse coding
- The composite absolute penalties family for grouped and hierarchical variable selection
- Title not available
- A quasi-Newton acceleration for high-dimensional optimization algorithms
- Equivalent Subgradient Versions of Hamiltonian and Euler–Lagrange Equations in Variational Analysis
- A nonsmooth version of Newton's method
- Title not available
- Alternating Projection-Proximal Methods for Convex Programming and Variational Inequalities
- Title not available
- Primal-dual first-order methods with \({\mathcal {O}(1/\varepsilon)}\) iteration-complexity for cone programming
- Structured variable selection with sparsity-inducing norms
- Efficient online and batch learning using forward backward splitting
- A feasible semismooth asymptotically Newton method for mixed complementarity problems
- Reconstructing DNA copy number by penalized estimation and imputation
Cited In (31)
- Smoothing composite proximal gradient algorithm for sparse group Lasso problems with nonsmooth loss functions
- Structured sparsity promoting functions
- Penalized Estimation of Frailty-Based Illness–Death Models for Semi-Competing Risks
- Regularization-based model tree for multi-output regression
- Group sparse structural smoothing recovery: model, statistical properties and algorithm
- Regularized regression when covariates are linked on a network: the 3CoSE algorithm
- Sparse regression with multi-type regularized feature modeling
- Network classification with applications to brain connectomics
- It's All Relative: Regression Analysis with Compositional Predictors
- Point process estimation with Mirror Prox algorithms
- Reducing the Complexity of Two Classes of Optimization Problems by Inexact Accelerated Proximal Gradient Method
- Selective linearization for multi-block statistical learning
- Locally Sparse Function-on-Function Regression
- Factorisable multitask quantile regression
- Proximal methods for the latent group lasso penalty
- Tree-Guided Rare Feature Selection and Logic Aggregation with Electronic Health Records Data
- Sparse group fused Lasso for model segmentation: a hybrid approach
- A graph decomposition-based approach for the graph-fused Lasso
- Easily Parallelizable and Distributable Class of Algorithms for Structured Sparsity, with Optimal Acceleration
- A Joint Fairness Model with Applications to Risk Predictions for Underrepresented Populations
- Group projected subspace pursuit for block sparse signal reconstruction: convergence analysis and applications
- Efficient inexact proximal gradient algorithms for structured sparsity-inducing norm
- Structured sparsity via alternating direction methods
- Truncated estimation in functional generalized linear regression models
- Tree-guided group lasso for multi-response regression with structured sparsity, with an application to eQTL mapping
- Heterogeneous Mediation Analysis on Epigenomic PTSD and Traumatic Stress in a Predominantly African American Cohort
- PAC-Bayesian risk bounds for group-analysis sparse regression by exponential weighting
- An Efficient Algorithm for Minimizing Multi Non-Smooth Component Functions
- Structured sparsity through convex optimization
- A smoothing stochastic gradient method for composite optimization
- A fast and efficient smoothing approach to Lasso regression and an application in statistical genetics: polygenic risk scores for chronic obstructive pulmonary disease (COPD)