Efficient block-coordinate descent algorithms for the group Lasso
Publication: 2392933
DOI: 10.1007/s12532-013-0051-x
zbMath: 1275.90059
OpenAlex: W2117575248
MaRDI QID: Q2392933
Donald Goldfarb, Zhiwei Qin, Katya Scheinberg
Publication date: 5 August 2013
Published in: Mathematical Programming Computation
Full work available at URL: https://doi.org/10.1007/s12532-013-0051-x
Keywords: block coordinate descent, group lasso, line search, iterative shrinkage thresholding, multiple measurement vector
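The keywords point at the core computation: block-wise (group-wise) updates of the group-lasso objective via iterative shrinkage-thresholding. The Python/NumPy sketch below is an illustrative reconstruction of that generic scheme (one prox-gradient step with group soft-thresholding per block), not the specific BCD algorithms or line-search rules of the paper; all function and variable names here are invented for illustration.

```python
import numpy as np

def group_soft_threshold(z, t):
    """Proximal operator of t * ||.||_2: shrink the whole block toward zero."""
    norm = np.linalg.norm(z)
    if norm <= t:
        return np.zeros_like(z)
    return (1.0 - t / norm) * z

def bcd_group_lasso(A, b, groups, lam, n_iter=100):
    """Generic block coordinate descent for
        min_x 0.5 * ||A x - b||^2 + lam * sum_g ||x_g||_2,
    using one shrinkage-thresholding step per block with step size 1/L_g,
    where L_g = ||A_g||_2^2 bounds the Lipschitz constant of the block gradient.
    `groups` is a list of index arrays partitioning the columns of A.
    """
    m, n = A.shape
    x = np.zeros(n)
    residual = A @ x - b                        # kept current after every block update
    for _ in range(n_iter):
        for g in groups:
            A_g = A[:, g]
            L_g = np.linalg.norm(A_g, 2) ** 2   # squared spectral norm of the block
            grad_g = A_g.T @ residual           # block gradient of the smooth part
            z = x[g] - grad_g / L_g
            x_new_g = group_soft_threshold(z, lam / L_g)
            residual += A_g @ (x_new_g - x[g])  # cheap residual update
            x[g] = x_new_g
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 12))
    groups = [np.arange(0, 4), np.arange(4, 8), np.arange(8, 12)]
    x_true = np.zeros(12)
    x_true[4:8] = rng.standard_normal(4)        # only the second group is active
    b = A @ x_true + 0.01 * rng.standard_normal(40)
    x_hat = bcd_group_lasso(A, b, groups, lam=1.0)
    print([float(np.linalg.norm(x_hat[g])) for g in groups])  # inactive groups shrink to ~0
```

The block update above is the plain ISTA-type step; the paper's contribution concerns more efficient block updates and line-search strategies built on this kind of iteration.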
Related Items
- Parallel block coordinate minimization with application to group regularized regression
- Structured Variable Selection for Regularized Generalized Canonical Correlation Analysis
- Hybrid safe-strong rules for efficient optimization in Lasso-type problems
- An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
- Inexact coordinate descent: complexity and preconditioning
- A flexible coordinate descent method
- Split Bregman algorithms for multiple measurement vector problem
- Practical inexact proximal quasi-Newton method with global complexity analysis
- Random block coordinate descent methods for linearly constrained optimization over networks
- Block layer decomposition schemes for training deep neural networks
- An alternating direction method for total variation denoising
- Separable approximations and decomposition methods for the augmented Lagrangian
- Block coordinate descent algorithms for large-scale sparse multiclass classification
- Empirical risk minimization: probabilistic complexity and stepsize strategy
- Randomness and permutations in coordinate descent methods
- Iterative rank-one matrix completion via singular value decomposition and nuclear norm regularization
- Linearly-convergent FISTA variant for composite optimization with duality
- A Randomized Nonmonotone Block Proximal Gradient Method for a Class of Structured Nonlinear Programming
- Locally Sparse Function-on-Function Regression
- Proximal methods for the latent group lasso penalty
- A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- On the complexity analysis of randomized block-coordinate descent methods
- Ontology Sparse Vector Learning Algorithm for Ontology Similarity Measuring and Ontology Mapping via ADAL Technology
- An efficient Hessian based algorithm for solving large-scale sparse group Lasso problems
- Sparse optimization for nonconvex group penalized estimation
- Group sparse optimization for learning predictive state representations
- Enhanced joint sparsity via iterative support detection
- Spatially multi-scale dynamic factor modeling via sparse estimation
- Iteration Complexity of a Block Coordinate Gradient Descent Method for Convex Optimization
- Varying coefficient linear discriminant analysis for dynamic data
- On the proximal Landweber Newton method for a class of nonsmooth convex problems
- LASSO for streaming data with adaptative filtering
Uses Software
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Surveying and comparing simultaneous sparse approximation (or group-lasso) algorithms
- A coordinate gradient descent method for nonsmooth separable minimization
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Sparse Optimization with Least-Squares Constraints
- Computing a Trust Region Step
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- The Group Lasso for Logistic Regression
- Numerical Optimization
- Sparse Reconstruction by Separable Approximation
- Theoretical Results on Sparse Representations of Multiple-Measurement Vectors
- Tackling Box-Constrained Optimization via a New Projected Quasi-Newton Approach
- Regularization and Variable Selection Via the Elastic Net
- Model Selection and Estimation in Regression with Grouped Variables
- Compressed sensing
- Convergence of a block coordinate descent method for nondifferentiable minimization
- Benchmarking optimization software with performance profiles