On the complexity analysis of randomized block-coordinate descent methods
DOI: 10.1007/s10107-014-0800-2
zbMATH Open: 1321.65100
arXiv: 1305.4723
OpenAlex: W2075660001
MaRDI QID: Q494345
Authors: Zhaosong Lu, Lin Xiao
Publication date: 31 August 2015
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1305.4723
Recommendations
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Accelerated, parallel, and proximal coordinate descent
- Parallel random coordinate descent method for composite minimization: convergence analysis and error bounds
- On the convergence of block coordinate descent type methods
- A flexible coordinate descent method
Keywords: convergence rate; iteration complexity; accelerated coordinate descent; composite minimization; randomized block-coordinate descent
MSC classifications: Numerical mathematical programming methods (65K05); Convex programming (90C25); Linear programming (90C05); Large-scale problems in mathematical programming (90C06)
Cites Work
- Coordinate descent algorithms for lasso penalized regression
- Convergence of a block coordinate descent method for nondifferentiable minimization
- Introductory lectures on convex optimization. A basic course.
- Gradient methods for minimizing composite functions
- A coordinate gradient descent method for nonsmooth separable minimization
- Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization
- Coordinate descent optimization for \(l^{1}\) minimization with application to compressed sensing: a greedy algorithm
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Parallel coordinate descent methods for big data optimization
- Accelerated block-coordinate relaxation for regularized optimization
- Efficiency of coordinate descent methods on huge-scale optimization problems
- Inexact coordinate descent: complexity and preconditioning
- Coordinate descent method for large-scale L2-loss linear support vector machines
- Randomized methods for linear constraints: convergence rates and conditioning
- Efficient serial and parallel coordinate descent methods for huge-scale truss topology design
- On the Nonasymptotic Convergence of Cyclic Coordinate Descent Methods
- On the convergence of block coordinate descent type methods
- Efficient block-coordinate descent algorithms for the group Lasso
- Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization
- A coordinate gradient descent method for \(\ell_{1}\)-regularized convex minimization
- Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization
- Block coordinate descent methods for semidefinite programming
- On the convergence of the coordinate descent method for convex differentiable minimization
- Iteration complexity analysis of block coordinate descent methods
Cited In (81)
- Inexact variable metric stochastic block-coordinate descent for regularized optimization
- Stochastic block-coordinate gradient projection algorithms for submodular maximization
- Linear Convergence of Random Dual Coordinate Descent on Nonpolyhedral Convex Problems
- On the convergence of a block-coordinate incremental gradient method
- On relaxed greedy randomized coordinate descent methods for solving large linear least-squares problems
- Inexact coordinate descent: complexity and preconditioning
- On optimal probabilities in stochastic coordinate descent methods
- A parallel line search subspace correction method for composite convex optimization
- A generic coordinate descent solver for non-smooth convex optimisation
- An efficient nonmonotone projected Barzilai-Borwein method for nonnegative matrix factorization with extrapolation
- On faster convergence of cyclic block coordinate descent-type methods for strongly convex minimization
- A fast block coordinate descent method for solving linear least-squares problems
- On Adaptive Sketch-and-Project for Solving Linear Systems
- Extended ADMM and BCD for nonseparable convex minimization models with quadratic coupling terms: convergence analysis and insights
- Pathwise coordinate optimization for sparse learning: algorithm and theory
- Fully asynchronous stochastic coordinate descent: a tight lower bound on the parallelism achieving linear speedup
- Coordinate descent with arbitrary sampling. I: Algorithms and complexity.
- Randomized sparse block Kaczmarz as randomized dual block-coordinate descent
- An accelerated randomized proximal coordinate gradient method and its application to regularized empirical risk minimization
- Fastest rates for stochastic mirror descent methods
- Asynchronous stochastic coordinate descent: parallelism and convergence properties
- Greedy randomized and maximal weighted residual Kaczmarz methods with oblique projection
- Linear convergence of randomized feasible descent methods under the weak strong convexity assumption
- Accelerated primal-dual proximal block coordinate updating methods for constrained convex optimization
- On greedy randomized block Gauss-Seidel method with averaging for sparse linear least-squares problems
- Stochastic mirror descent method for linear ill-posed problems in Banach spaces
- Generalized stochastic Frank-Wolfe algorithm with stochastic "substitute" gradient for structured convex optimization
- Randomized primal-dual proximal block coordinate updates
- Block stochastic gradient iteration for convex and nonconvex optimization
- Cyclic coordinate-update algorithms for fixed-point problems: analysis and applications
- A flexible coordinate descent method
- Accelerated, parallel, and proximal coordinate descent
- Parallel random coordinate descent method for composite minimization: convergence analysis and error bounds
- Parallel random block-coordinate forward-backward algorithm: a unified convergence analysis
- On the convergence of asynchronous parallel iteration with unbounded delays
- Iteration complexity of a block coordinate gradient descent method for convex optimization
- Stochastic quasi-Fejér block-coordinate fixed point iterations with random sweeping
- Block coordinate type methods for optimization and learning
- A random block-coordinate Douglas-Rachford splitting method with low computational complexity for binary logistic regression
- Fast and safe: accelerated gradient methods with optimality certificates and underestimate sequences
- On the convergence of block coordinate descent type methods
- A two-step randomized Gauss-Seidel method for solving large-scale linear least squares problems
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- A globally convergent algorithm for nonconvex optimization based on block coordinate update
- Misspecified nonconvex statistical optimization for sparse phase retrieval
- Optimization in high dimensions via accelerated, parallel, and proximal coordinate descent
- On the complexity analysis of the primal solutions for the accelerated randomized dual coordinate ascent
- Faster convergence of a randomized coordinate descent method for linearly constrained optimization problems
- On the convergence of a randomized block coordinate descent algorithm for a matrix least squares problem
- Gauss-Seidel method with oblique direction
- Katyusha: the first direct acceleration of stochastic gradient methods
- Iteration complexity analysis of block coordinate descent methods
- Coordinate descent with arbitrary sampling. II: Expected separable overapproximation.
- Random Coordinate Descent Methods for Nonseparable Composite Optimization
- Iterative positive thresholding algorithm for non-negative sparse optimization
- A block symmetric Gauss-Seidel decomposition theorem for convex composite quadratic programming and its applications
- Markov chain block coordinate descent
- Worst-case complexity of cyclic coordinate descent: \(O(n^2)\) gap with randomized version
- Random block coordinate descent methods for linearly constrained optimization over networks
- Decomposable norm minimization with proximal-gradient homotopy algorithm
- On the complexity of parallel coordinate descent
- Distributed block coordinate descent for minimizing partially separable functions
- Randomized block proximal damped Newton method for composite self-concordant minimization
- A class of parallel doubly stochastic algorithms for large-scale learning
- Randomized methods for computing optimal transport without regularization and their convergence analysis
- Efficient Global Optimization of Two-Layer ReLU Networks: Quadratic-Time Algorithms and Adversarial Training
- Computing the best approximation over the intersection of a polyhedral set and the doubly nonnegative cone
- Statistical computational learning
- A randomized nonmonotone block proximal gradient method for a class of structured nonlinear programming
- Adaptive proximal SGD based on new estimating sequences for sparser ERM
- Adaptive coordinate sampling for stochastic primal–dual optimization
- Block Policy Mirror Descent
- Convergence rate of random scan coordinate ascent variational inference under log-concavity
- Coordinate descent methods beyond smoothness and separability
- On multi-step greedy randomized coordinate descent method for solving large linear least-squares problems
- Hybrid greedy randomized coordinate descent method for solving large-scale linear least square problem
- Convergence of an asynchronous block-coordinate forward-backward algorithm for convex composite optimization
- Two symmetrized coordinate descent methods can be \(O(n^2)\) times slower than the randomized version