On the complexity analysis of randomized block-coordinate descent methods
DOI: 10.1007/s10107-014-0800-2
zbMath: 1321.65100
arXiv: 1305.4723
OpenAlex: W2075660001
MaRDI QID: Q494345
Publication date: 31 August 2015
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1305.4723
Keywords: convergence rate; iteration complexity; accelerated coordinate descent; composite minimization; randomized block-coordinate descent
MSC classification: Numerical mathematical programming methods (65K05); Convex programming (90C25); Large-scale problems in mathematical programming (90C06); Linear programming (90C05)
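The keywords above refer to randomized block-coordinate descent (RBCD) for composite minimization, i.e. minimizing \(F(x) = f(x) + \Psi(x)\) with \(f\) smooth and \(\Psi\) block-separable. As a minimal illustrative sketch only (not the paper's exact algorithm or step-size rules), the NumPy code below applies uniform-sampling RBCD with block-wise proximal gradient steps to a lasso instance; the equal-size block partition, the uniform sampling, and the name `rbcd_lasso` are assumptions made for this example.

```python
import numpy as np

def rbcd_lasso(A, b, lam, n_blocks=10, n_iters=5000, seed=0):
    """Sketch: randomized block-coordinate descent for
    min_x 0.5 * ||A x - b||^2 + lam * ||x||_1
    (smooth least-squares part plus separable l1 regularizer)."""
    rng = np.random.default_rng(seed)
    _, n = A.shape
    x = np.zeros(n)
    blocks = np.array_split(np.arange(n), n_blocks)
    # Block Lipschitz constants: squared spectral norm of each column block.
    L = [np.linalg.norm(A[:, blk], 2) ** 2 for blk in blocks]
    r = A @ x - b  # residual A x - b, maintained incrementally
    for _ in range(n_iters):
        i = rng.integers(n_blocks)      # pick one block uniformly at random
        blk = blocks[i]
        g = A[:, blk].T @ r             # partial gradient of the smooth part
        z = x[blk] - g / L[i]           # block gradient step, step size 1/L_i
        # Proximal step for the l1 term: soft-thresholding at lam / L_i.
        x_new = np.sign(z) * np.maximum(np.abs(z) - lam / L[i], 0.0)
        r += A[:, blk] @ (x_new - x[blk])  # keep residual consistent
        x[blk] = x_new
    return x

# Example usage on a small random instance (illustrative only).
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 50))
    b = rng.standard_normal(200)
    x_hat = rbcd_lasso(A, b, lam=0.1)
    obj = 0.5 * np.sum((A @ x_hat - b) ** 2) + 0.1 * np.sum(np.abs(x_hat))
    print("objective:", obj)
```

Replacing the uniform draw with probabilities proportional to the block constants \(L_i\) would give a non-uniform sampling variant in the spirit of the related work on optimal probabilities in stochastic coordinate descent; the uniform rule is kept here for simplicity.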
Related Items
- An efficient nonmonotone projected Barzilai–Borwein method for nonnegative matrix factorization with extrapolation
- A two-step randomized Gauss-Seidel method for solving large-scale linear least squares problems
- Greedy randomized and maximal weighted residual Kaczmarz methods with oblique projection
- Block coordinate type methods for optimization and learning
- Parallel random block-coordinate forward-backward algorithm: a unified convergence analysis
- Accelerated, Parallel, and Proximal Coordinate Descent
- On the Complexity Analysis of the Primal Solutions for the Accelerated Randomized Dual Coordinate Ascent
- An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
- A flexible coordinate descent method
- On optimal probabilities in stochastic coordinate descent methods
- A Fast Block Coordinate Descent Method for Solving Linear Least-Squares Problems
- Distributed Block Coordinate Descent for Minimizing Partially Separable Functions
- Parallel Random Coordinate Descent Method for Composite Minimization: Convergence Analysis and Error Bounds
- Iterative positive thresholding algorithm for non-negative sparse optimization
- A globally convergent algorithm for nonconvex optimization based on block coordinate update
- Block Stochastic Gradient Iteration for Convex and Nonconvex Optimization
- A block symmetric Gauss-Seidel decomposition theorem for convex composite quadratic programming and its applications
- Cyclic Coordinate-Update Algorithms for Fixed-Point Problems: Analysis and Applications
- A random block-coordinate Douglas-Rachford splitting method with low computational complexity for binary logistic regression
- Asynchronous Stochastic Coordinate Descent: Parallelism and Convergence Properties
- Adaptive coordinate sampling for stochastic primal–dual optimization
- Block Policy Mirror Descent
- On the convergence of asynchronous parallel iteration with unbounded delays
- Stochastic mirror descent method for linear ill-posed problems in Banach spaces
- On relaxed greedy randomized coordinate descent methods for solving large linear least-squares problems
- Randomized Block Proximal Damped Newton Method for Composite Self-Concordant Minimization
- A Randomized Nonmonotone Block Proximal Gradient Method for a Class of Structured Nonlinear Programming
- Misspecified nonconvex statistical optimization for sparse phase retrieval
- On greedy randomized block Gauss-Seidel method with averaging for sparse linear least-squares problems
- On multi-step greedy randomized coordinate descent method for solving large linear least-squares problems
- Efficient Global Optimization of Two-Layer ReLU Networks: Quadratic-Time Algorithms and Adversarial Training
- Random Coordinate Descent Methods for Nonseparable Composite Optimization
- Adaptive proximal SGD based on new estimating sequences for sparser ERM
- Faster convergence of a randomized coordinate descent method for linearly constrained optimization problems
- Extended ADMM and BCD for nonseparable convex minimization models with quadratic coupling terms: convergence analysis and insights
- Stochastic block-coordinate gradient projection algorithms for submodular maximization
- Worst-case complexity of cyclic coordinate descent: \(O(n^2)\) gap with randomized version
- Fully asynchronous stochastic coordinate descent: a tight lower bound on the parallelism achieving linear speedup
- On the complexity of parallel coordinate descent
- Inexact variable metric stochastic block-coordinate descent for regularized optimization
- Pathwise coordinate optimization for sparse learning: algorithm and theory
- Decomposable norm minimization with proximal-gradient homotopy algorithm
- Two Symmetrized Coordinate Descent Methods Can Be \(O(n^2)\) Times Slower Than the Randomized Version
- Accelerated primal-dual proximal block coordinate updating methods for constrained convex optimization
- On Faster Convergence of Cyclic Block Coordinate Descent-type Methods for Strongly Convex Minimization
- Iteration complexity analysis of block coordinate descent methods
- Generalized stochastic Frank-Wolfe algorithm with stochastic "substitute gradient" for structured convex optimization
- A parallel line search subspace correction method for composite convex optimization
- Fast and safe: accelerated gradient methods with optimality certificates and underestimate sequences
- On the convergence of a randomized block coordinate descent algorithm for a matrix least squares problem
- Gauss-Seidel method with oblique direction
- Randomized primal-dual proximal block coordinate updates
- Optimization in High Dimensions via Accelerated, Parallel, and Proximal Coordinate Descent
- Computing the Best Approximation over the Intersection of a Polyhedral Set and the Doubly Nonnegative Cone
- On Adaptive Sketch-and-Project for Solving Linear Systems
- Stochastic Quasi-Fejér Block-Coordinate Fixed Point Iterations with Random Sweeping
- Iteration Complexity of a Block Coordinate Gradient Descent Method for Convex Optimization
- On the convergence of a block-coordinate incremental gradient method
- A generic coordinate descent solver for non-smooth convex optimisation
- Linear Convergence of Random Dual Coordinate Descent on Nonpolyhedral Convex Problems
Cites Work
- Parallel coordinate descent methods for big data optimization
- Inexact coordinate descent: complexity and preconditioning
- Gradient methods for minimizing composite functions
- Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization
- Iteration complexity analysis of block coordinate descent methods
- A coordinate gradient descent method for \(\ell_{1}\)-regularized convex minimization
- A coordinate gradient descent method for nonsmooth separable minimization
- Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization
- On the convergence of the coordinate descent method for convex differentiable minimization
- Introductory lectures on convex optimization. A basic course.
- Coordinate descent optimization for \(\ell^{1}\) minimization with application to compressed sensing; a greedy algorithm
- Efficient block-coordinate descent algorithms for the group Lasso
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Coordinate descent algorithms for lasso penalized regression
- Block Coordinate Descent Methods for Semidefinite Programming
- Accelerated Block-coordinate Relaxation for Regularized Optimization
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Randomized Methods for Linear Constraints: Convergence Rates and Conditioning
- Efficient Serial and Parallel Coordinate Descent Methods for Huge-Scale Truss Topology Design
- On the Nonasymptotic Convergence of Cyclic Coordinate Descent Methods
- On the Convergence of Block Coordinate Descent Type Methods
- Convergence of a block coordinate descent method for nondifferentiable minimization
- Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization