Accelerated block-coordinate relaxation for regularized optimization
From MaRDI portal
Publication:2902875
DOI: 10.1137/100808563 · zbMATH Open: 1357.49105 · OpenAlex: W2060777387 · MaRDI QID: Q2902875 · FDO: Q2902875
Author: Stephen J. Wright
Publication date: 22 August 2012
Published in: SIAM Journal on Optimization
Full work available at URL: https://doi.org/10.1137/100808563
Recommendations
- Block-proximal methods with spatially adapted acceleration
- Accelerating block coordinate descent methods with identification strategies
- scientific article; zbMATH DE number 1762841
- Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization
- Block coordinate descent for smooth nonconvex constrained minimization
- Block coordinate descent methods for semidefinite programming
- scientific article; zbMATH DE number 1215271
- A fast block coordinate descent method for solving linear least-squares problems
- Accelerated primal-dual proximal block coordinate updating methods for constrained convex optimization
- Accelerated Stochastic Algorithms for Nonconvex Finite-Sum and Multiblock Optimization
MSC classification: Sensitivity, stability, parametric optimization (90C31) · Sensitivity, stability, well-posedness (49K40) · Decomposition methods (49M27)
Cited In (34)
- Inexact variable metric stochastic block-coordinate descent for regularized optimization
- Active-set identification with complexity guarantees of an almost cyclic 2-coordinate descent method with Armijo line search
- Stochastic block-coordinate gradient projection algorithms for submodular maximization
- A second-order method for strongly convex \(\ell _1\)-regularization problems
- Inexact coordinate descent: complexity and preconditioning
- Convergence of the augmented decomposition algorithm
- Title not available
- A fast active set block coordinate descent algorithm for \(\ell_1\)-regularized least squares
- Second-order orthant-based methods with enriched Hessian information for sparse \(\ell _1\)-optimization
- An accelerated randomized proximal coordinate gradient method and its application to regularized empirical risk minimization
- Accelerated primal-dual proximal block coordinate updating methods for constrained convex optimization
- Accelerating inexact successive quadratic approximation for regularized optimization through manifold identification
- Regularized optimization with spatial coupling for robust decision making
- A flexible coordinate descent method
- "Active-set complexity of proximal gradient: how long does it take to find the sparsity pattern?"
- First-order Methods for the Impatient: Support Identification in Finite Time with Convergent Frank--Wolfe Variants
- Local linear convergence of proximal coordinate descent algorithm
- On the complexity analysis of randomized block-coordinate descent methods
- An adaptive accelerated proximal gradient method and its homotopy continuation for sparse optimization
- Proximal gradient methods with adaptive subspace sampling
- Performance of first- and second-order methods for \(\ell_1\)-regularized least squares problems
- A randomized nonmonotone block proximal gradient method for a class of structured nonlinear programming
- Block coordinate descent algorithms for large-scale sparse multiclass classification
- Block coordinate proximal gradient methods with variable Bregman functions for nonsmooth separable optimization
- Acceleration of block coordinate descent method achieves the \(O(1/k^2)\) rate of convergence for a convex function with block coordinate strong convexity
- Feature selection in SVM via polyhedral \(k\)-norm
- On the Global Convergence of Randomized Coordinate Gradient Descent for Nonconvex Optimization
- Title not available
- A Subspace Acceleration Method for Minimization Involving a Group Sparsity-Inducing Regularizer
- Sample size selection in optimization methods for machine learning
- Random block coordinate descent methods for linearly constrained optimization over networks
- On the complexity of parallel coordinate descent
- Randomized block proximal damped Newton method for composite self-concordant minimization
- Separating variables to accelerate non-convex regularized optimization
This page was built for publication: Accelerated block-coordinate relaxation for regularized optimization