Parallel Random Coordinate Descent Method for Composite Minimization: Convergence Analysis and Error Bounds
From MaRDI portal
Publication:3465244
DOI: 10.1137/130950288
zbMath: 1329.90108
arXiv: 1312.5302
OpenAlex: W2592062427
MaRDI QID: Q3465244
Publication date: 21 January 2016
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1312.5302
Keywords: rate of convergence; partially separable functions; composite minimization; generalized error bound condition; parallel random coordinate descent algorithm
MSC classifications: Numerical mathematical programming methods (65K05); Convex programming (90C25); Large-scale problems in mathematical programming (90C06); Optimality conditions and duality in mathematical programming (90C46); Decomposition methods (49M27)
Related Items
- Parallel block coordinate minimization with application to group regularized regression
- Convergence results of a new monotone inertial forward-backward splitting algorithm under the local Hölder error bound condition
- Parallel random block-coordinate forward-backward algorithm: a unified convergence analysis
- Accelerated, Parallel, and Proximal Coordinate Descent
- An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
- Distributed Block Coordinate Descent for Minimizing Partially Separable Functions
- Parallel Random Coordinate Descent Method for Composite Minimization: Convergence Analysis and Error Bounds
- Random block coordinate descent methods for linearly constrained optimization over networks
- Extended randomized Kaczmarz method for sparse least squares and impulsive noise problems
- Linear convergence of first order methods for non-strongly convex optimization
- Randomized Block Adaptive Linear System Solvers
- Random Coordinate Descent Methods for Nonseparable Composite Optimization
- Synchronous parallel block coordinate descent method for nonsmooth convex function minimization
- Faster convergence of a randomized coordinate descent method for linearly constrained optimization problems
- A Randomized Coordinate Descent Method with Volume Sampling
- On the linear convergence of the approximate proximal splitting method for non-smooth convex optimization
- On the complexity of parallel coordinate descent
- Global Convergence Rate of Proximal Incremental Aggregated Gradient Methods
- RSG: Beating Subgradient Method without Smoothness and Strong Convexity
- Faster Randomized Block Kaczmarz Algorithms
- Optimization in High Dimensions via Accelerated, Parallel, and Proximal Coordinate Descent
- Variational analysis perspective on linear convergence of some first order methods for nonsmooth convex optimization problems
- Coordinate descent algorithms
- Linear Convergence of Random Dual Coordinate Descent on Nonpolyhedral Convex Problems
- Linear convergence of prox-SVRG method for separable non-smooth convex optimization problems under bounded metric subregularity
- Proximal Gradient Methods for Machine Learning and Imaging
- Parallel coordinate descent methods for big data optimization
Uses Software
Cites Work
- Parallel coordinate descent methods for big data optimization
- Gradient methods for minimizing composite functions
- A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints
- On the complexity analysis of randomized block-coordinate descent methods
- Iteration complexity analysis of block coordinate descent methods
- Interior-point Lagrangian decomposition method for separable convex optimization
- A coordinate gradient descent method for nonsmooth separable minimization
- Error bounds and convergence analysis of feasible descent methods: A general approach
- Introductory lectures on convex optimization. A basic course.
- Bounds for error in the solution set of a perturbed linear program
- Random block coordinate descent methods for linearly constrained optimization over networks
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Distributed Coordinate Descent Method for Learning with Big Data
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Rate Analysis of Inexact Dual First-Order Methods: Application to Dual Decomposition
- Randomized Methods for Linear Constraints: Convergence Rates and Conditioning
- Parallel Random Coordinate Descent Method for Composite Minimization: Convergence Analysis and Error Bounds
- Incremental Stochastic Subgradient Algorithms for Convex Optimization
- Variational Analysis
- Random Coordinate Descent Algorithms for Multi-Agent Convex Optimization Over Networks
- Non-Lipschitz $\ell_{p}$-Regularization and Box Constrained Model for Image Restoration
- On the Convergence of Block Coordinate Descent Type Methods