Iteration complexity analysis of block coordinate descent methods


DOI: 10.1007/s10107-016-1057-8
zbMath: 1367.49010
arXiv: 1310.6957
OpenAlex: W1499137793
MaRDI QID: Q526831

Meisam Razaviyayn, Zhi-Quan Luo, Mingyi Hong, Xiang-Feng Wang

Publication date: 15 May 2017

Published in: Mathematical Programming. Series A. Series B

Full work available at URL: https://arxiv.org/abs/1310.6957
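The paper analyzes the iteration complexity of block coordinate descent (BCD) type methods, which minimize a function by cycling over blocks of variables and updating one block at a time. As a purely illustrative sketch, not taken from the paper, the following Python snippet applies cyclic block coordinate gradient descent to a least-squares objective; the block partition, step-size rule, and synthetic data are assumptions chosen only for the example.

```python
# Illustrative sketch of cyclic block coordinate gradient descent (not the
# authors' code): minimize f(x) = 0.5 * ||A x - b||^2 by cycling over blocks
# of coordinates and taking a gradient step on each block. Block size, step
# sizes, and problem data below are assumptions made for this example.
import numpy as np

def cyclic_bcd_least_squares(A, b, block_size=2, n_cycles=100):
    m, n = A.shape
    x = np.zeros(n)
    # Partition the coordinates {0, ..., n-1} into consecutive blocks.
    blocks = [np.arange(i, min(i + block_size, n)) for i in range(0, n, block_size)]
    for _ in range(n_cycles):
        for blk in blocks:
            # Gradient of f restricted to the current block of variables.
            grad_blk = A[:, blk].T @ (A @ x - b)
            # Step size 1 / L_blk, where L_blk is the block Lipschitz constant
            # of the partial gradient (squared spectral norm of A[:, blk]).
            L_blk = np.linalg.norm(A[:, blk], ord=2) ** 2
            x[blk] -= grad_blk / L_blk
    return x

# Small usage example on synthetic data.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 6))
b = rng.standard_normal(20)
x_bcd = cyclic_bcd_least_squares(A, b)
print(np.linalg.norm(A @ x_bcd - b))
```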



Related Items

Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems
An accelerated coordinate gradient descent algorithm for non-separable composite optimization
An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
Parallel Random Coordinate Descent Method for Composite Minimization: Convergence Analysis and Error Bounds
A globally convergent algorithm for nonconvex optimization based on block coordinate update
Cyclic Coordinate-Update Algorithms for Fixed-Point Problems: Analysis and Applications
Cyclic Coordinate Dual Averaging with Extrapolation
On the convergence of asynchronous parallel iteration with unbounded delays
Randomized Block Proximal Damped Newton Method for Composite Self-Concordant Minimization
A Randomized Nonmonotone Block Proximal Gradient Method for a Class of Structured Nonlinear Programming
Local linear convergence of proximal coordinate descent algorithm
Synchronous parallel block coordinate descent method for nonsmooth convex function minimization
Hybrid Jacobian and Gauss--Seidel Proximal Block Coordinate Update Methods for Linearly Constrained Convex Programming
Extended ADMM and BCD for nonseparable convex minimization models with quadratic coupling terms: convergence analysis and insights
Worst-case complexity of cyclic coordinate descent: \(O(n^2)\) gap with randomized version
Asynchronous parallel primal-dual block coordinate update methods for affinely constrained convex programs
On the linear convergence of the approximate proximal splitting method for non-smooth convex optimization
Alternating minimization methods for strongly convex optimization
On the complexity analysis of randomized block-coordinate descent methods
A Coordinate-Descent Primal-Dual Algorithm with Large Step Size and Possibly Nonseparable Functions
On Faster Convergence of Cyclic Block Coordinate Descent-type Methods for Strongly Convex Minimization
A parallel line search subspace correction method for composite convex optimization
New analysis of linear convergence of gradient-type methods via unifying error bound conditions
Accelerating Nonnegative Matrix Factorization Algorithms Using Extrapolation
A Block Successive Upper-Bound Minimization Method of Multipliers for Linearly Constrained Convex Optimization
CoordinateWise Descent Methods for Leading Eigenvalue Problem
Variational analysis perspective on linear convergence of some first order methods for nonsmooth convex optimization problems
Dykstra's splitting and an approximate proximal point algorithm for minimizing the sum of convex functions
Computing the Best Approximation over the Intersection of a Polyhedral Set and the Doubly Nonnegative Cone
Iteration Complexity of a Block Coordinate Gradient Descent Method for Convex Optimization
Sparse regression at scale: branch-and-bound rooted in first-order optimization
Bregman Finito/MISO for Nonconvex Regularized Finite Sum Minimization without Lipschitz Gradient Continuity

