Efficient block-coordinate descent algorithms for the group Lasso


Publication: 2392933


DOI: 10.1007/s12532-013-0051-x
zbMath: 1275.90059
MaRDI QID: Q2392933

Donald Goldfarb, Zhiwei Qin, Katya Scheinberg

Publication date: 5 August 2013

Published in: Mathematical Programming Computation

Full work available at URL: https://doi.org/10.1007/s12532-013-0051-x


90C25: Convex programming
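
The record above lists only bibliographic metadata. As a rough illustration of the problem class named in the title, the following is a minimal Python sketch of cyclic block-coordinate descent for the group Lasso, where each block update is a proximal-gradient (block soft-thresholding) step. This is a generic textbook-style variant given for orientation, not necessarily the specific algorithms developed in the cited paper; all function names, parameters, and the test data are illustrative assumptions.

```python
import numpy as np

def block_soft_threshold(z, t):
    """Proximal operator of t * ||.||_2: shrink the whole vector z toward zero."""
    norm = np.linalg.norm(z)
    if norm <= t:
        return np.zeros_like(z)
    return (1.0 - t / norm) * z

def group_lasso_bcd(X, y, groups, lam, n_sweeps=100, tol=1e-6):
    """
    Cyclic block-coordinate descent for
        min_w 0.5 * ||y - X w||_2^2 + lam * sum_g ||w_g||_2.
    Each block update is one proximal-gradient step on that block,
    scaled by the block Lipschitz constant ||X_g||_2^2.
    `groups` is a list of index arrays partitioning the columns of X.
    (Illustrative sketch, not the paper's exact method.)
    """
    n, p = X.shape
    w = np.zeros(p)
    residual = y - X @ w                                   # maintained incrementally
    lips = [np.linalg.norm(X[:, g], 2) ** 2 for g in groups]

    for _ in range(n_sweeps):
        max_change = 0.0
        for g, L in zip(groups, lips):
            if L == 0.0:
                continue
            w_old = w[g].copy()
            grad_g = -X[:, g].T @ residual                 # gradient of the smooth part w.r.t. w_g
            w[g] = block_soft_threshold(w[g] - grad_g / L, lam / L)
            delta = w[g] - w_old
            if np.any(delta):
                residual -= X[:, g] @ delta                # keep residual consistent with new w
                max_change = max(max_change, np.linalg.norm(delta))
        if max_change < tol:                               # stop when a full sweep barely moves w
            break
    return w

if __name__ == "__main__":
    # Small synthetic check: only the first group of coefficients is nonzero.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((50, 12))
    groups = [np.arange(0, 4), np.arange(4, 8), np.arange(8, 12)]
    w_true = np.concatenate([rng.standard_normal(4), np.zeros(8)])
    y = X @ w_true + 0.1 * rng.standard_normal(50)
    print(np.round(group_lasso_bcd(X, y, groups, lam=1.0), 3))
```

With a sufficiently large penalty `lam`, the block soft-thresholding step sets entire groups of coefficients exactly to zero, which is the group-sparsity behavior the group Lasso is designed for.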


Related Items

A Randomized Nonmonotone Block Proximal Gradient Method for a Class of Structured Nonlinear Programming
Spatially multi-scale dynamic factor modeling via sparse estimation
Sparse optimization for nonconvex group penalized estimation
Structured Variable Selection for Regularized Generalized Canonical Correlation Analysis
Iteration Complexity of a Block Coordinate Gradient Descent Method for Convex Optimization
Iterative rank-one matrix completion via singular value decomposition and nuclear norm regularization
Linearly-convergent FISTA variant for composite optimization with duality
Inexact coordinate descent: complexity and preconditioning
Split Bregman algorithms for multiple measurement vector problem
Practical inexact proximal quasi-Newton method with global complexity analysis
Block coordinate descent algorithms for large-scale sparse multiclass classification
Proximal methods for the latent group lasso penalty
A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints
On the complexity analysis of randomized block-coordinate descent methods
Group sparse optimization for learning predictive state representations
Enhanced joint sparsity via iterative support detection
Hybrid safe-strong rules for efficient optimization in Lasso-type problems
A flexible coordinate descent method
Varying coefficient linear discriminant analysis for dynamic data
LASSO for streaming data with adaptative filtering
Block layer decomposition schemes for training deep neural networks
Randomness and permutations in coordinate descent methods
An efficient Hessian based algorithm for solving large-scale sparse group Lasso problems
On the proximal Landweber Newton method for a class of nonsmooth convex problems
Parallel block coordinate minimization with application to group regularized regression
Random block coordinate descent methods for linearly constrained optimization over networks
Empirical risk minimization: probabilistic complexity and stepsize strategy
Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
Ontology Sparse Vector Learning Algorithm for Ontology Similarity Measuring and Ontology Mapping via ADAL Technology
An alternating direction method for total variation denoising
Separable approximations and decomposition methods for the augmented Lagrangian
An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization


Uses Software


Cites Work