Iteration complexity analysis of block coordinate descent methods

DOI: 10.1007/s10107-016-1057-8
zbMath: 1367.49010
arXiv: 1310.6957
MaRDI QID: Q526831

Meisam Razaviyayn, Zhi-Quan Luo, Mingyi Hong, Xiang-Feng Wang

Publication date: 15 May 2017

Published in: Mathematical Programming. Series A. Series B

Full work available at URL: https://arxiv.org/abs/1310.6957


90C25: Convex programming

90C56: Derivative-free methods and methods using generalized derivatives

49J52: Nonsmooth analysis
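
To fix ideas, the sketch below is a minimal, hypothetical illustration of cyclic block coordinate descent (BCD) with exact per-block minimization on a convex quadratic, not the specific update rules or complexity bounds analyzed in the paper; the function `cyclic_bcd` and the block partition are illustrative assumptions.

```python
# Illustrative sketch only: generic cyclic BCD on f(x) = 0.5 x'Ax - b'x,
# minimizing exactly over one coordinate block at a time (Gauss-Seidel style).
# This is NOT the paper's algorithm, just the basic scheme it studies.
import numpy as np

def cyclic_bcd(A, b, blocks, iters=100):
    """Minimize 0.5 x'Ax - b'x by exact minimization over each block in turn."""
    x = np.zeros_like(b)
    n = len(b)
    for _ in range(iters):
        for idx in blocks:  # sweep the blocks in a fixed cyclic order
            rest = np.setdiff1d(np.arange(n), idx)
            # Exact block step: solve A[idx,idx] x_idx = b[idx] - A[idx,rest] x_rest
            rhs = b[idx] - A[np.ix_(idx, rest)] @ x[rest]
            x[idx] = np.linalg.solve(A[np.ix_(idx, idx)], rhs)
    return x

# Example: 4 variables split into two blocks of 2.
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = M.T @ M + np.eye(4)            # symmetric positive definite
b = rng.standard_normal(4)
x = cyclic_bcd(A, b, blocks=[np.array([0, 1]), np.array([2, 3])])
print(np.allclose(A @ x, b, atol=1e-6))  # converges to the global minimizer
```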


Related Items

A Randomized Nonmonotone Block Proximal Gradient Method for a Class of Structured Nonlinear Programming
Hybrid Jacobian and Gauss-Seidel Proximal Block Coordinate Update Methods for Linearly Constrained Convex Programming
A Coordinate-Descent Primal-Dual Algorithm with Large Step Size and Possibly Nonseparable Functions
On Faster Convergence of Cyclic Block Coordinate Descent-type Methods for Strongly Convex Minimization
Coordinatewise Descent Methods for Leading Eigenvalue Problem
Computing the Best Approximation over the Intersection of a Polyhedral Set and the Doubly Nonnegative Cone
Cyclic Coordinate-Update Algorithms for Fixed-Point Problems: Analysis and Applications
Randomized Block Proximal Damped Newton Method for Composite Self-Concordant Minimization
Iteration Complexity of a Block Coordinate Gradient Descent Method for Convex Optimization
Bregman Finito/MISO for Nonconvex Regularized Finite Sum Minimization without Lipschitz Gradient Continuity
Cyclic Coordinate Dual Averaging with Extrapolation
Local linear convergence of proximal coordinate descent algorithm
On the linear convergence of the approximate proximal splitting method for non-smooth convex optimization
On the complexity analysis of randomized block-coordinate descent methods
A globally convergent algorithm for nonconvex optimization based on block coordinate update
Extended ADMM and BCD for nonseparable convex minimization models with quadratic coupling terms: convergence analysis and insights
Asynchronous parallel primal-dual block coordinate update methods for affinely constrained convex programs
Variational analysis perspective on linear convergence of some first order methods for nonsmooth convex optimization problems
Sparse regression at scale: branch-and-bound rooted in first-order optimization
Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems
An accelerated coordinate gradient descent algorithm for non-separable composite optimization
Synchronous parallel block coordinate descent method for nonsmooth convex function minimization
Worst-case complexity of cyclic coordinate descent: \(O(n^2)\) gap with randomized version
Alternating minimization methods for strongly convex optimization
New analysis of linear convergence of gradient-type methods via unifying error bound conditions
Dykstra's splitting and an approximate proximal point algorithm for minimizing the sum of convex functions
On the convergence of asynchronous parallel iteration with unbounded delays
A parallel line search subspace correction method for composite convex optimization
Accelerating Nonnegative Matrix Factorization Algorithms Using Extrapolation
A Block Successive Upper-Bound Minimization Method of Multipliers for Linearly Constrained Convex Optimization
An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
Parallel Random Coordinate Descent Method for Composite Minimization: Convergence Analysis and Error Bounds

