A First-Order Optimization Algorithm for Statistical Learning with Hierarchical Sparsity Structure
From MaRDI portal
Publication:5086011
DOI: 10.1287/ijoc.2021.1069
OpenAlex: W3211822432
MaRDI QID: Q5086011
No author found.
Publication date: 30 June 2022
Published in: INFORMS Journal on Computing
Full work available at URL: https://arxiv.org/abs/2001.03322
Keywords: alternating direction method of multipliers; proximal methods; latent overlapping group Lasso; error bound theory; hierarchical sparsity structure
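As background for the proximal methods named in the keywords (not part of this record itself), here is a minimal proximal gradient (ISTA) sketch for the plain lasso, with synthetic data; the variable names and problem sizes are illustrative assumptions:

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * ||.||_1 (elementwise soft-thresholding).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista(A, b, lam, n_iter=500):
    # Proximal gradient (ISTA) for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the smooth part
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)         # gradient step on the quadratic term
        x = soft_threshold(x - grad / L, lam / L)  # prox step on the l1 term
    return x

# Synthetic sparse-recovery example (hypothetical data, noiseless).
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true
x_hat = ista(A, b, lam=0.1)
print(np.linalg.norm(x_hat - x_true))
```

Structured penalties such as the latent overlapping group Lasso replace the soft-thresholding step with a more involved proximal operator, which is the setting the paper above addresses.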
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Best subset selection via a modern optimization lens
- Gradient methods for minimizing composite functions
- A lasso for hierarchical interactions
- Proximal methods for the latent group lasso penalty
- On the linear convergence of a proximal gradient method for a class of nonsmooth convex minimization problems
- On the linear convergence of the alternating direction method of multipliers
- Approximation accuracy, gradient methods, and error bound for structured convex optimization
- A coordinate gradient descent method for nonsmooth separable minimization
- Error bounds in mathematical programming
- Strong formulations for quadratic optimization with M-matrices and indicator variables
- A unified approach to error bounds for structured convex optimization problems
- Hierarchical sparse modeling: a choice of two group Lasso formulations
- Sparse high-dimensional regression: exact scalable algorithms and phase transitions
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- A Unified Convergence Analysis of Block Successive Minimization Methods for Nonsmooth Optimization
- On the Convergence Rate of Dual Ascent Methods for Linearly Constrained Convex Minimization
- Group Regularized Estimation Under Structural Hierarchy
- Variable Selection Using Adaptive Nonlinear Interaction Structures in High Dimensions
- The Discrete Dantzig Selector: Estimating Sparse Linear Models via Mixed Integer Linear Optimization
- Learning with Submodular Functions: A Convex Optimization Perspective
- Model Selection and Estimation in Regression with Grouped Variables
- Convergence of a block coordinate descent method for nondifferentiable minimization
- Structured sparsity through convex optimization