Parallel Random Coordinate Descent Method for Composite Minimization: Convergence Analysis and Error Bounds

DOI: 10.1137/130950288
zbMath: 1329.90108
arXiv: 1312.5302
OpenAlex: W2592062427
MaRDI QID: Q3465244

Dragos Clipici, Ion Necoara

Publication date: 21 January 2016

Published in: SIAM Journal on Optimization

Full work available at URL: https://arxiv.org/abs/1312.5302
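
For orientation, the paper studies random (block-)coordinate descent for composite objectives of the form f(x) + Psi(x) with separable Psi, where a randomly sampled set of coordinates is updated in parallel at each iteration. The following is a minimal illustrative sketch, not the authors' exact algorithm: it assumes a least-squares f and an l1 regularizer, uses the naive per-coordinate 1/L_i step sizes, and the parameter name `tau` (the number of coordinates updated in parallel) is ours. Safe step sizes for large parallel blocks depend on the degree of separability of f, which is part of what the paper's analysis addresses.

```python
# Hedged sketch of one flavor of parallel random coordinate descent for
#     min_x  f(x) + psi(x),   f(x) = 0.5*||A x - b||^2,  psi(x) = lam*||x||_1.
# Each iteration samples a random block of coordinates and applies, to every
# sampled coordinate simultaneously, a proximal gradient step computed from
# the SAME current iterate -- the updates are independent, hence parallelizable.
import numpy as np

def soft_threshold(v, t):
    """Prox of t*|.| (soft-thresholding): coordinate-wise prox of the l1 term."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def parallel_random_cd(A, b, lam, tau, n_iters=500, seed=0):
    """tau = number of coordinates updated in parallel per iteration (our name)."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    L = np.sum(A * A, axis=0)          # coordinate-wise Lipschitz constants ||A[:, i]||^2
    x = np.zeros(n)
    r = A @ x - b                      # maintained residual A x - b
    for _ in range(n_iters):
        S = rng.choice(n, size=tau, replace=False)  # uniform random block
        g = A[:, S].T @ r              # partial gradients, all from the same iterate
        x_new = soft_threshold(x[S] - g / L[S], lam / L[S])
        r += A[:, S] @ (x_new - x[S])  # cheap residual update
        x[S] = x_new
    return x

# Toy usage: a small random LASSO instance.
rng = np.random.default_rng(1)
A = rng.standard_normal((40, 100))
b = rng.standard_normal(40)
x_hat = parallel_random_cd(A, b, lam=0.1, tau=8)
print("objective:", 0.5 * np.linalg.norm(A @ x_hat - b) ** 2 + 0.1 * np.abs(x_hat).sum())
```
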



Related Items

Parallel block coordinate minimization with application to group regularized regression
Convergence results of a new monotone inertial forward-backward splitting algorithm under the local Hölder error bound condition
Parallel random block-coordinate forward-backward algorithm: a unified convergence analysis
Accelerated, Parallel, and Proximal Coordinate Descent
An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
Distributed Block Coordinate Descent for Minimizing Partially Separable Functions
Parallel Random Coordinate Descent Method for Composite Minimization: Convergence Analysis and Error Bounds
Random block coordinate descent methods for linearly constrained optimization over networks
Extended randomized Kaczmarz method for sparse least squares and impulsive noise problems
Linear convergence of first order methods for non-strongly convex optimization
Randomized Block Adaptive Linear System Solvers
Random Coordinate Descent Methods for Nonseparable Composite Optimization
Synchronous parallel block coordinate descent method for nonsmooth convex function minimization
Faster convergence of a randomized coordinate descent method for linearly constrained optimization problems
A Randomized Coordinate Descent Method with Volume Sampling
On the linear convergence of the approximate proximal splitting method for non-smooth convex optimization
On the complexity of parallel coordinate descent
Global Convergence Rate of Proximal Incremental Aggregated Gradient Methods
RSG: Beating Subgradient Method without Smoothness and Strong Convexity
Faster Randomized Block Kaczmarz Algorithms
Optimization in High Dimensions via Accelerated, Parallel, and Proximal Coordinate Descent
Variational analysis perspective on linear convergence of some first order methods for nonsmooth convex optimization problems
Coordinate descent algorithms
Linear Convergence of Random Dual Coordinate Descent on Nonpolyhedral Convex Problems
Linear convergence of prox-SVRG method for separable non-smooth convex optimization problems under bounded metric subregularity
Proximal Gradient Methods for Machine Learning and Imaging
Parallel coordinate descent methods for big data optimization

