Iteration complexity analysis of block coordinate descent methods
From MaRDI portal
Publication: 526831
DOI: 10.1007/s10107-016-1057-8
zbMath: 1367.49010
arXiv: 1310.6957
OpenAlex: W1499137793
MaRDI QID: Q526831
Meisam Razaviyayn, Zhi-Quan Luo, Mingyi Hong, Xiang-Feng Wang
Publication date: 15 May 2017
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1310.6957
Convex programming (90C25)
Derivative-free methods and methods using generalized derivatives (90C56)
Nonsmooth analysis (49J52)
Related Items
- Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems
- An accelerated coordinate gradient descent algorithm for non-separable composite optimization
- An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
- Parallel Random Coordinate Descent Method for Composite Minimization: Convergence Analysis and Error Bounds
- A globally convergent algorithm for nonconvex optimization based on block coordinate update
- Cyclic Coordinate-Update Algorithms for Fixed-Point Problems: Analysis and Applications
- Cyclic Coordinate Dual Averaging with Extrapolation
- On the convergence of asynchronous parallel iteration with unbounded delays
- Randomized Block Proximal Damped Newton Method for Composite Self-Concordant Minimization
- A Randomized Nonmonotone Block Proximal Gradient Method for a Class of Structured Nonlinear Programming
- Local linear convergence of proximal coordinate descent algorithm
- Synchronous parallel block coordinate descent method for nonsmooth convex function minimization
- Hybrid Jacobian and Gauss-Seidel Proximal Block Coordinate Update Methods for Linearly Constrained Convex Programming
- Extended ADMM and BCD for nonseparable convex minimization models with quadratic coupling terms: convergence analysis and insights
- Worst-case complexity of cyclic coordinate descent: \(O(n^2)\) gap with randomized version
- Asynchronous parallel primal-dual block coordinate update methods for affinely constrained convex programs
- On the linear convergence of the approximate proximal splitting method for non-smooth convex optimization
- Alternating minimization methods for strongly convex optimization
- On the complexity analysis of randomized block-coordinate descent methods
- A Coordinate-Descent Primal-Dual Algorithm with Large Step Size and Possibly Nonseparable Functions
- On Faster Convergence of Cyclic Block Coordinate Descent-type Methods for Strongly Convex Minimization
- A parallel line search subspace correction method for composite convex optimization
- New analysis of linear convergence of gradient-type methods via unifying error bound conditions
- Accelerating Nonnegative Matrix Factorization Algorithms Using Extrapolation
- A Block Successive Upper-Bound Minimization Method of Multipliers for Linearly Constrained Convex Optimization
- Coordinatewise Descent Methods for Leading Eigenvalue Problem
- Variational analysis perspective on linear convergence of some first order methods for nonsmooth convex optimization problems
- Dykstra's splitting and an approximate proximal point algorithm for minimizing the sum of convex functions
- Computing the Best Approximation over the Intersection of a Polyhedral Set and the Doubly Nonnegative Cone
- Iteration Complexity of a Block Coordinate Gradient Descent Method for Convex Optimization
- Sparse regression at scale: branch-and-bound rooted in first-order optimization
- Bregman Finito/MISO for Nonconvex Regularized Finite Sum Minimization without Lipschitz Gradient Continuity
Cites Work
- On the linear convergence of a proximal gradient method for a class of nonsmooth convex minimization problems
- On the complexity analysis of randomized block-coordinate descent methods
- Approximation accuracy, gradient methods, and error bound for structured convex optimization
- A unified primal-dual algorithm framework based on Bregman iteration
- A coordinate gradient descent method for nonsmooth separable minimization
- Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization
- Error bounds and convergence analysis of feasible descent methods: A general approach
- On the convergence of the coordinate descent method for convex differentiable minimization
- Introductory lectures on convex optimization. A basic course.
- A new inexact alternating directions method for monotone variational inequalities
- On the convergence of the block nonlinear Gauss-Seidel method under convex constraints
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- A Unified Convergence Analysis of Block Successive Minimization Methods for Nonsmooth Optimization
- Maximum Block Improvement and Polynomial Optimization
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- On the Convergence of Alternating Minimization for Convex Programming with Applications to Iteratively Reweighted Least Squares and Decomposition Schemes
- Iterative Water-Filling for Gaussian Vector Multiple-Access Channels
- Iteratively reweighted least squares minimization for sparse recovery
- On the Linear Convergence of Descent Methods for Convex Essentially Smooth Minimization
- On the Convergence Rate of Dual Ascent Methods for Linearly Constrained Convex Minimization
- Decomposition by Partial Linearization: Parallel Optimization of Multi-Agent Systems
- The Linearized Alternating Direction Method of Multipliers for Dantzig Selector
- On the Nonasymptotic Convergence of Cyclic Coordinate Descent Methods
- On the Convergence of Block Coordinate Descent Type Methods
- Convergence of a block coordinate descent method for nondifferentiable minimization