On the complexity analysis of randomized block-coordinate descent methods
From MaRDI portal
Abstract: In this paper we analyze the randomized block-coordinate descent (RBCD) methods proposed in [8,11] for minimizing the sum of a smooth convex function and a block-separable convex function. In particular, we extend Nesterov's technique developed in [8] for analyzing the RBCD method for minimizing a smooth convex function over a block-separable closed convex set to the aforementioned more general problem, and obtain a sharper expected-value type of convergence rate than the one implied in [11]. We also obtain a better high-probability type of iteration complexity, which improves upon the one in [11] by at least an amount depending on the target solution accuracy \(\epsilon\) and the number of problem blocks \(n\). In addition, for unconstrained smooth convex minimization, we develop a new technique called \textit{randomized estimate sequence} to analyze the accelerated RBCD method proposed by Nesterov [11], and establish a sharper expected-value type of convergence rate than the one given in [11].
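The problem class studied in the abstract, minimizing a smooth convex function plus a block-separable convex function, can be illustrated with a minimal sketch of a randomized block-coordinate proximal gradient method applied to the lasso problem \(\min_x \tfrac12\|Ax-b\|^2 + \lambda\|x\|_1\). This is an illustrative toy implementation under standard assumptions (uniform block sampling, per-block Lipschitz step sizes), not the exact scheme analyzed in the paper; the function name `rbcd_lasso` and the block partition are assumptions for the example.

```python
import numpy as np

def rbcd_lasso(A, b, lam, n_blocks, n_iters=500, seed=0):
    """Randomized block-coordinate proximal gradient (illustrative sketch)
    for min_x 0.5*||A x - b||^2 + lam*||x||_1, an instance of
    'smooth convex + block-separable convex'."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    blocks = np.array_split(np.arange(n), n_blocks)
    # Per-block Lipschitz constants of the partial gradient: L_i = ||A_i||_2^2
    # (squared spectral norm of the block of columns).
    L = [np.linalg.norm(A[:, blk], 2) ** 2 for blk in blocks]
    x = np.zeros(n)
    r = A @ x - b                      # maintain residual so block gradients are cheap
    for _ in range(n_iters):
        i = rng.integers(n_blocks)     # pick a block uniformly at random
        blk = blocks[i]
        g = A[:, blk].T @ r            # partial gradient w.r.t. block i
        z = x[blk] - g / L[i]          # gradient step on the smooth part
        # Prox of (lam/L_i)*||.||_1 on the block: componentwise soft-thresholding.
        x_new = np.sign(z) * np.maximum(np.abs(z) - lam / L[i], 0.0)
        r += A[:, blk] @ (x_new - x[blk])
        x[blk] = x_new
    return x
```

Because the nonsmooth term is block-separable, the proximal step decouples across blocks, so each iteration touches only one block of variables; this per-iteration cheapness is what makes RBCD attractive for large \(n\).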
Recommendations
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Accelerated, parallel, and proximal coordinate descent
- Parallel random coordinate descent method for composite minimization: convergence analysis and error bounds
- On the convergence of block coordinate descent type methods
- A flexible coordinate descent method
Cites work
- A coordinate gradient descent method for \(\ell_{1}\)-regularized convex minimization
- A coordinate gradient descent method for nonsmooth separable minimization
- Accelerated block-coordinate relaxation for regularized optimization
- Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization
- Block coordinate descent methods for semidefinite programming
- Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization
- Convergence of a block coordinate descent method for nondifferentiable minimization
- Coordinate descent algorithms for lasso penalized regression
- Coordinate descent method for large-scale L2-loss linear support vector machines
- Coordinate descent optimization for \(l^{1}\) minimization with application to compressed sensing; a greedy algorithm
- Efficiency of coordinate descent methods on huge-scale optimization problems
- Efficient block-coordinate descent algorithms for the group Lasso
- Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization
- Efficient serial and parallel coordinate descent methods for huge-scale truss topology design
- Gradient methods for minimizing composite functions
- Inexact coordinate descent: complexity and preconditioning
- Introductory lectures on convex optimization. A basic course.
- Iteration complexity analysis of block coordinate descent methods
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- On the Nonasymptotic Convergence of Cyclic Coordinate Descent Methods
- On the convergence of block coordinate descent type methods
- On the convergence of the coordinate descent method for convex differentiable minimization
- Parallel coordinate descent methods for big data optimization
- Randomized methods for linear constraints: convergence rates and conditioning
Cited in (81)
- Worst-case complexity of cyclic coordinate descent: \(O(n^2)\) gap with randomized version
- Block coordinate type methods for optimization and learning
- On greedy randomized block Gauss-Seidel method with averaging for sparse linear least-squares problems
- Markov chain block coordinate descent
- On the complexity analysis of the primal solutions for the accelerated randomized dual coordinate ascent
- Fully asynchronous stochastic coordinate descent: a tight lower bound on the parallelism achieving linear speedup
- On the convergence of a randomized block coordinate descent algorithm for a matrix least squares problem
- Gauss-Seidel method with oblique direction
- Coordinate descent with arbitrary sampling. I: Algorithms and complexity.
- Stochastic mirror descent method for linear ill-posed problems in Banach spaces
- Fast and safe: accelerated gradient methods with optimality certificates and underestimate sequences
- An accelerated randomized proximal coordinate gradient method and its application to regularized empirical risk minimization
- Faster convergence of a randomized coordinate descent method for linearly constrained optimization problems
- Inexact coordinate descent: complexity and preconditioning
- Cyclic coordinate-update algorithms for fixed-point problems: analysis and applications
- Parallel random block-coordinate forward-backward algorithm: a unified convergence analysis
- On optimal probabilities in stochastic coordinate descent methods
- Stochastic block-coordinate gradient projection algorithms for submodular maximization
- On faster convergence of cyclic block coordinate descent-type methods for strongly convex minimization
- A fast block coordinate descent method for solving linear least-squares problems
- Inexact variable metric stochastic block-coordinate descent for regularized optimization
- On the complexity of parallel coordinate descent
- Iterative positive thresholding algorithm for non-negative sparse optimization
- Linear Convergence of Random Dual Coordinate Descent on Nonpolyhedral Convex Problems
- Generalized stochastic Frank-Wolfe algorithm with stochastic ``substitute gradient'' for structured convex optimization
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Fastest rates for stochastic mirror descent methods
- Accelerated, parallel, and proximal coordinate descent
- On the convergence of a block-coordinate incremental gradient method
- Misspecified nonconvex statistical optimization for sparse phase retrieval
- Parallel random coordinate descent method for composite minimization: convergence analysis and error bounds
- Random Coordinate Descent Methods for Nonseparable Composite Optimization
- On relaxed greedy randomized coordinate descent methods for solving large linear least-squares problems
- On Adaptive Sketch-and-Project for Solving Linear Systems
- Asynchronous stochastic coordinate descent: parallelism and convergence properties
- A globally convergent algorithm for nonconvex optimization based on block coordinate update
- A parallel line search subspace correction method for composite convex optimization
- Iteration complexity of a block coordinate gradient descent method for convex optimization
- A generic coordinate descent solver for non-smooth convex optimisation
- Randomized block proximal damped Newton method for composite self-concordant minimization
- On the convergence of block coordinate descent type methods
- Katyusha: the first direct acceleration of stochastic gradient methods
- Distributed block coordinate descent for minimizing partially separable functions
- Greedy randomized and maximal weighted residual Kaczmarz methods with oblique projection
- Random block coordinate descent methods for linearly constrained optimization over networks
- Decomposable norm minimization with proximal-gradient homotopy algorithm
- Stochastic quasi-Fejér block-coordinate fixed point iterations with random sweeping
- Pathwise coordinate optimization for sparse learning: algorithm and theory
- A random block-coordinate Douglas-Rachford splitting method with low computational complexity for binary logistic regression
- Iteration complexity analysis of block coordinate descent methods
- Randomized primal-dual proximal block coordinate updates
- A block symmetric Gauss-Seidel decomposition theorem for convex composite quadratic programming and its applications
- scientific article; zbMATH DE number 7306860 (no title available)
- Randomized sparse block Kaczmarz as randomized dual block-coordinate descent
- A two-step randomized Gauss-Seidel method for solving large-scale linear least squares problems
- On the convergence of asynchronous parallel iteration with unbounded delays
- A flexible coordinate descent method
- Linear convergence of randomized feasible descent methods under the weak strong convexity assumption
- Accelerated primal-dual proximal block coordinate updating methods for constrained convex optimization
- Optimization in high dimensions via accelerated, parallel, and proximal coordinate descent
- Coordinate descent with arbitrary sampling. II: Expected separable overapproximation.
- Block stochastic gradient iteration for convex and nonconvex optimization
- An efficient nonmonotone projected Barzilai-Borwein method for nonnegative matrix factorization with extrapolation
- Extended ADMM and BCD for nonseparable convex minimization models with quadratic coupling terms: convergence analysis and insights
- A class of parallel doubly stochastic algorithms for large-scale learning
- Convergence rate of random scan coordinate ascent variational inference under log-concavity
- Adaptive coordinate sampling for stochastic primal–dual optimization
- Block Policy Mirror Descent
- On multi-step greedy randomized coordinate descent method for solving large linear least-squares problems
- Coordinate descent methods beyond smoothness and separability
- Statistical computational learning
- Convergence of an asynchronous block-coordinate forward-backward algorithm for convex composite optimization
- A randomized nonmonotone block proximal gradient method for a class of structured nonlinear programming
- Two symmetrized coordinate descent methods can be \(O(n^2)\) times slower than the randomized version
- scientific article; zbMATH DE number 3905760 (no title available)
- Adaptive proximal SGD based on new estimating sequences for sparser ERM
- Randomized methods for computing optimal transport without regularization and their convergence analysis
- Efficient Global Optimization of Two-Layer ReLU Networks: Quadratic-Time Algorithms and Adversarial Training
- scientific article; zbMATH DE number 7409363 (no title available)
- Computing the best approximation over the intersection of a polyhedral set and the doubly nonnegative cone
- Hybrid greedy randomized coordinate descent method for solving large-scale linear least square problem
MaRDI item Q494345