On the Nonasymptotic Convergence of Cyclic Coordinate Descent Methods
Publication:5300539
DOI: 10.1137/110840054
zbMath: 1270.90032
OpenAlex: W1974524469
MaRDI QID: Q5300539
Publication date: 27 June 2013
Published in: SIAM Journal on Optimization
Full work available at URL: https://doi.org/10.1137/110840054
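For context, the publication concerns cyclic coordinate descent, in which the coordinates of a smooth convex objective are updated one at a time in a fixed (Gauss-Seidel) order using coordinate-wise gradient steps. The sketch below is a minimal, generic illustration of that scheme, assuming coordinate-wise Lipschitz-continuous partial gradients; it is not the paper's specific algorithm or analysis, and the names used here (`cyclic_coordinate_descent`, `grad_i`, `L`) are illustrative assumptions.

```python
import numpy as np

def cyclic_coordinate_descent(grad_i, L, x0, n_epochs=100):
    """Generic cyclic (Gauss-Seidel) coordinate descent sketch.

    grad_i(x, i) -- partial derivative of a smooth convex f at x w.r.t. coordinate i
    L            -- coordinate-wise Lipschitz constants of the partial gradients
    x0           -- starting point
    """
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(n_epochs):
        for i in range(x.size):          # sweep the coordinates in a fixed cyclic order
            x[i] -= grad_i(x, i) / L[i]  # coordinate-wise gradient step with stepsize 1/L[i]
    return x

# Illustrative use on a convex quadratic f(x) = 0.5 x^T A x - b^T x with A positive definite,
# where the sweep reduces to the classical Gauss-Seidel iteration for A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
grad_i = lambda x, i: A[i] @ x - b[i]
L = np.diag(A)                            # coordinate-wise Lipschitz constants
x_hat = cyclic_coordinate_descent(grad_i, L, np.zeros(2))
print(x_hat, np.linalg.solve(A, b))       # the two vectors should be close
```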
Related Items (31)
Parallel block coordinate minimization with application to group regularized regression
The Cyclic Block Conditional Gradient Method for Convex Optimization Problems
An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
Inexact coordinate descent: complexity and preconditioning
Distributed Block Coordinate Descent for Minimizing Partially Separable Functions
Convergence of incentive-driven dynamics in Fisher markets
A globally convergent algorithm for nonconvex optimization based on block coordinate update
Block Stochastic Gradient Iteration for Convex and Nonconvex Optimization
Block-cyclic stochastic coordinate descent for deep neural networks
Empirical risk minimization: probabilistic complexity and stepsize strategy
Cyclic Coordinate Dual Averaging with Extrapolation
Randomness and permutations in coordinate descent methods
Asynchronous Stochastic Coordinate Descent: Parallelism and Convergence Properties
Analysis of the Block Coordinate Descent Method for Linear Ill-Posed Problems
On the Global Convergence of Randomized Coordinate Gradient Descent for Nonconvex Optimization
Local linear convergence of proximal coordinate descent algorithm
Dual block-coordinate forward-backward algorithm with application to deconvolution and deinterlacing of video sequences
A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints
Unnamed Item
Worst-case complexity of cyclic coordinate descent: \(O(n^2)\) gap with randomized version
Alternating minimization methods for strongly convex optimization
On the complexity analysis of randomized block-coordinate descent methods
On the complexity of parallel coordinate descent
ADDRESSING IMBALANCED INSURANCE DATA THROUGH ZERO-INFLATED POISSON REGRESSION WITH BOOSTING
On Faster Convergence of Cyclic Block Coordinate Descent-type Methods for Strongly Convex Minimization
Iteration complexity analysis of block coordinate descent methods
An Efficient Inexact ABCD Method for Least Squares Semidefinite Programming
A Block Successive Upper-Bound Minimization Method of Multipliers for Linearly Constrained Convex Optimization
A knowledge-supported improvement of the PSO method
Iteration Complexity of a Block Coordinate Gradient Descent Method for Convex Optimization
Parallel coordinate descent methods for big data optimization