Accelerated Block-coordinate Relaxation for Regularized Optimization
From MaRDI portal
Publication: 2902875
DOI: 10.1137/100808563 · zbMath: 1357.49105 · OpenAlex: W2060777387 · MaRDI QID: Q2902875
Publication date: 22 August 2012
Published in: SIAM Journal on Optimization
Full work available at URL: https://doi.org/10.1137/100808563
Sensitivity, stability, well-posedness (49K40) · Sensitivity, stability, parametric optimization (90C31) · Decomposition methods (49M27)
Related Items (29)
Performance of first- and second-order methods for \(\ell_1\)-regularized least squares problems
An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
Inexact coordinate descent: complexity and preconditioning
A flexible coordinate descent method
A Subspace Acceleration Method for Minimization Involving a Group Sparsity-Inducing Regularizer
Block coordinate proximal gradient methods with variable Bregman functions for nonsmooth separable optimization
Active-Set Identification with Complexity Guarantees of an Almost Cyclic 2-Coordinate Descent Method with Armijo Line Search
Random block coordinate descent methods for linearly constrained optimization over networks
Block coordinate descent algorithms for large-scale sparse multiclass classification
Randomized Block Proximal Damped Newton Method for Composite Self-Concordant Minimization
A Randomized Nonmonotone Block Proximal Gradient Method for a Class of Structured Nonlinear Programming
On the Global Convergence of Randomized Coordinate Gradient Descent for Nonconvex Optimization
Accelerating inexact successive quadratic approximation for regularized optimization through manifold identification
Local linear convergence of proximal coordinate descent algorithm
Stochastic block-coordinate gradient projection algorithms for submodular maximization
Convergence of the augmented decomposition algorithm
On the complexity analysis of randomized block-coordinate descent methods
On the complexity of parallel coordinate descent
Inexact variable metric stochastic block-coordinate descent for regularized optimization
Second-order orthant-based methods with enriched Hessian information for sparse \(\ell_1\)-optimization
Sample size selection in optimization methods for machine learning
Regularized optimization with spatial coupling for robust decision making
A Fast Active Set Block Coordinate Descent Algorithm for \(\ell_1\)-Regularized Least Squares
Feature selection in SVM via polyhedral \(k\)-norm
"Active-set complexity" of proximal gradient: how long does it take to find the sparsity pattern?
First-order Methods for the Impatient: Support Identification in Finite Time with Convergent Frank--Wolfe Variants
An adaptive accelerated proximal gradient method and its homotopy continuation for sparse optimization
Proximal Gradient Methods with Adaptive Subspace Sampling
A second-order method for strongly convex \(\ell_1\)-regularization problems