Efficient block-coordinate descent algorithms for the group Lasso
From MaRDI portal
Publication: Q2392933
DOI: 10.1007/S12532-013-0051-X
zbMATH Open: 1275.90059
OpenAlex: W2117575248
MaRDI QID: Q2392933
FDO: Q2392933
Donald Goldfarb, Zhiwei (Tony) Qin, Katya Scheinberg
Publication date: 5 August 2013
Published in: Mathematical Programming Computation
Full work available at URL: https://doi.org/10.1007/s12532-013-0051-x
Recommendations
- Group coordinate descent algorithms for nonconvex penalized regression
- Coordinate descent algorithms for lasso penalized regression
- LARS-type algorithm for group Lasso
- A fast unified algorithm for solving group-lasso penalized learning problems
- Sparse optimization for nonconvex group penalized estimation
Keywords: block coordinate descent; linesearch; group lasso; iterative shrinkage thresholding; multiple measurement vector
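The keywords above (group lasso, block coordinate descent, iterative shrinkage thresholding) can be illustrated with a minimal sketch: cyclic block-coordinate descent on the group-lasso objective (1/2)||y − Xb||² + λ Σ_g ||b_g||₂, under the simplifying assumption that each group's columns are orthonormalized so the block subproblem has a closed-form group soft-thresholding solution. This is an illustrative toy, not the paper's algorithm (which handles general blocks via line search and ISTA-type steps); all function names and data here are invented for the example.

```python
import numpy as np

def group_soft_threshold(z, t):
    # Block soft-thresholding: proximal operator of t * ||.||_2.
    nz = np.linalg.norm(z)
    if nz <= t:
        return np.zeros_like(z)
    return (1.0 - t / nz) * z

def group_lasso_bcd(X, y, groups, lam, n_iter=200):
    """Cyclic block-coordinate descent for
        (1/2) * ||y - X b||^2 + lam * sum_g ||b_g||_2,
    assuming each block X[:, g] has orthonormal columns so the
    per-block subproblem is solved exactly by soft-thresholding."""
    b = np.zeros(X.shape[1])
    r = y - X @ b                      # running residual
    for _ in range(n_iter):
        for g in groups:
            Xg = X[:, g]
            r = r + Xg @ b[g]          # remove group g's contribution
            z = Xg.T @ r               # unpenalized block minimizer
            b[g] = group_soft_threshold(z, lam)
            r = r - Xg @ b[g]          # restore updated contribution
    return b

# Synthetic example: 4 groups of 3 features, signal only in group 0.
rng = np.random.default_rng(0)
n, p, gsize = 60, 12, 3
groups = [list(range(i, i + gsize)) for i in range(0, p, gsize)]
X = rng.standard_normal((n, p))
for g in groups:                       # orthonormalize each block (assumption above)
    X[:, g], _ = np.linalg.qr(X[:, g])
b_true = np.zeros(p)
b_true[:3] = [2.0, -1.5, 1.0]
y = X @ b_true + 0.05 * rng.standard_normal(n)
b_hat = group_lasso_bcd(X, y, groups, lam=0.5)
```

With a sufficiently large penalty, entire inactive groups are driven exactly to zero, which is the group-level sparsity pattern that distinguishes the group lasso from the plain lasso.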
Cites Work
- Computing a Trust Region Step
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Sparse Optimization with Least-Squares Constraints
- Numerical Optimization
- Title not available
- Convergence of a block coordinate descent method for nondifferentiable minimization
- Regularization and Variable Selection Via the Elastic Net
- Model Selection and Estimation in Regression with Grouped Variables
- Benchmarking optimization software with performance profiles
- The Group Lasso for Logistic Regression
- Tackling Box-Constrained Optimization via a New Projected Quasi-Newton Approach
- A coordinate gradient descent method for nonsmooth separable minimization
- Title not available
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Compressed sensing
- Sparse Reconstruction by Separable Approximation
- Efficiency of coordinate descent methods on huge-scale optimization problems
- Theoretical Results on Sparse Representations of Multiple-Measurement Vectors
- Surveying and comparing simultaneous sparse approximation (or group-lasso) algorithms
Cited In (37)
- Separable approximations and decomposition methods for the augmented Lagrangian
- LASSO for streaming data with adaptative filtering
- Block layer decomposition schemes for training deep neural networks
- Hybrid safe-strong rules for efficient optimization in Lasso-type problems
- Inexact coordinate descent: complexity and preconditioning
- A Randomized Nonmonotone Block Proximal Gradient Method for a Class of Structured Nonlinear Programming
- Smoothing composite proximal gradient algorithm for sparse group Lasso problems with nonsmooth loss functions
- A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints
- On the proximal Landweber Newton method for a class of nonsmooth convex problems
- Iteration Complexity of a Block Coordinate Gradient Descent Method for Convex Optimization
- Structured Variable Selection for Regularized Generalized Canonical Correlation Analysis
- A flexible coordinate descent method
- Spatially multi-scale dynamic factor modeling via sparse estimation
- Practical inexact proximal quasi-Newton method with global complexity analysis
- Locally Sparse Function-on-Function Regression
- Proximal methods for the latent group lasso penalty
- On the complexity analysis of randomized block-coordinate descent methods
- Parallel block coordinate minimization with application to group regularized regression
- Ontology sparse vector learning algorithm for ontology similarity measuring and ontology mapping via ADAL technology
- An efficient Hessian based algorithm for solving large-scale sparse group Lasso problems
- Title not available
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Sparse optimization for nonconvex group penalized estimation
- Split Bregman algorithms for multiple measurement vector problem
- Block coordinate descent algorithms for large-scale sparse multiclass classification
- An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
- An alternating direction method for total variation denoising
- Monitoring of group-structured high-dimensional processes via sparse group Lasso
- Group sparse optimization for learning predictive state representations
- Enhanced joint sparsity via iterative support detection
- The Group Lasso for Stable Recovery of Block-Sparse Signal Representations
- Linearly-convergent FISTA variant for composite optimization with duality
- Iterative rank-one matrix completion via singular value decomposition and nuclear norm regularization
- Varying coefficient linear discriminant analysis for dynamic data
- Empirical risk minimization: probabilistic complexity and stepsize strategy
- Random block coordinate descent methods for linearly constrained optimization over networks
- Randomness and permutations in coordinate descent methods