Efficient block-coordinate descent algorithms for the group Lasso
From MaRDI portal
Publication: 2392933
DOI: 10.1007/s12532-013-0051-x
zbMath: 1275.90059
MaRDI QID: Q2392933
Donald Goldfarb, Katya Scheinberg, Zhiwei Qin
Publication date: 5 August 2013
Published in: Mathematical Programming Computation
Full work available at URL: https://doi.org/10.1007/s12532-013-0051-x
Keywords: block coordinate descent; group lasso; linesearch; iterative shrinkage thresholding; multiple measurement vector
90C25: Convex programming
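As context for the keywords above, the following is a minimal sketch of cyclic block-coordinate descent with a group soft-thresholding (iterative shrinkage-thresholding) step for the group lasso objective 0.5*||Ax - b||^2 + lam * sum_g ||x_g||_2. It is an illustrative implementation of the general technique, not the specific algorithm or code of the cited paper; all function names and parameters here are the sketch's own.

```python
import numpy as np

def group_soft_threshold(v, t):
    # Block soft-thresholding operator: shrinks the whole group toward zero,
    # zeroing it out entirely when its norm falls below the threshold t.
    norm = np.linalg.norm(v)
    if norm <= t:
        return np.zeros_like(v)
    return (1.0 - t / norm) * v

def group_lasso_bcd(A, b, groups, lam, n_iter=200):
    """Cyclic block-coordinate descent for
    0.5*||A x - b||^2 + lam * sum_g ||x_g||_2,
    where `groups` is a list of index arrays partitioning the columns of A."""
    x = np.zeros(A.shape[1])
    r = b - A @ x  # running residual b - A x
    for _ in range(n_iter):
        for g in groups:
            Ag = A[:, g]
            # Lipschitz constant of the block gradient (squared spectral norm).
            Lg = np.linalg.norm(Ag, 2) ** 2
            grad = -Ag.T @ r          # gradient of the smooth term w.r.t. x_g
            v = x[g] - grad / Lg      # gradient step on the block
            new = group_soft_threshold(v, lam / Lg)  # proximal (shrinkage) step
            r -= Ag @ (new - x[g])    # keep the residual consistent
            x[g] = new
    return x
```

Each block update is a proximal-gradient (shrinkage) step on one group while the others are held fixed; the residual is updated incrementally so each pass costs only a few matrix-vector products per group.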
Related Items
- A Randomized Nonmonotone Block Proximal Gradient Method for a Class of Structured Nonlinear Programming
- Structured Variable Selection for Regularized Generalized Canonical Correlation Analysis
- Iteration Complexity of a Block Coordinate Gradient Descent Method for Convex Optimization
- Inexact coordinate descent: complexity and preconditioning
- Split Bregman algorithms for multiple measurement vector problem
- Practical inexact proximal quasi-Newton method with global complexity analysis
- Block coordinate descent algorithms for large-scale sparse multiclass classification
- Proximal methods for the latent group lasso penalty
- A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints
- On the complexity analysis of randomized block-coordinate descent methods
- A flexible coordinate descent method
- On the proximal Landweber Newton method for a class of nonsmooth convex problems
- Parallel block coordinate minimization with application to group regularized regression
- Random block coordinate descent methods for linearly constrained optimization over networks
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Ontology Sparse Vector Learning Algorithm for Ontology Similarity Measuring and Ontology Mapping via ADAL Technology
- An alternating direction method for total variation denoising
- Separable approximations and decomposition methods for the augmented Lagrangian
- An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
Uses Software
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Surveying and comparing simultaneous sparse approximation (or group-lasso) algorithms
- A coordinate gradient descent method for nonsmooth separable minimization
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Sparse Optimization with Least-Squares Constraints
- Computing a Trust Region Step
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- The Group Lasso for Logistic Regression
- Numerical Optimization
- Sparse Reconstruction by Separable Approximation
- Theoretical Results on Sparse Representations of Multiple-Measurement Vectors
- Tackling Box-Constrained Optimization via a New Projected Quasi-Newton Approach
- Regularization and Variable Selection Via the Elastic Net
- Model Selection and Estimation in Regression with Grouped Variables
- Compressed sensing
- Convergence of a block coordinate descent method for nondifferentiable minimization
- Benchmarking optimization software with performance profiles