Smooth minimization of nonsmooth functions with parallel coordinate descent methods
From MaRDI portal
Publication:2325237
DOI: 10.1007/978-3-030-12119-8_4
zbMath: 1421.90061
arXiv: 1309.5885
OpenAlex: W2099119387
MaRDI QID: Q2325237
Peter Richtárik, Olivier Fercoq
Publication date: 9 September 2019
Full work available at URL: https://arxiv.org/abs/1309.5885
Mathematics Subject Classification:
Nonconvex programming, global optimization (90C26)
Deterministic scheduling theory in operations research (90B35)
Models and methods for concurrent and distributed computing (process algebras, bisimulation, transition nets, etc.) (68Q85)
Related Items (12)
Accelerated, Parallel, and Proximal Coordinate Descent
On optimal probabilities in stochastic coordinate descent methods
Distributed Block Coordinate Descent for Minimizing Partially Separable Functions
Asynchronous Stochastic Coordinate Descent: Parallelism and Convergence Properties
Stochastic Block Mirror Descent Methods for Nonsmooth and Stochastic Optimization
On the complexity of parallel coordinate descent
Asynchronous Lagrangian scenario decomposition
A parallel line search subspace correction method for composite convex optimization
Restarting the accelerated coordinate descent method with a rough strong convexity estimate
Optimization in High Dimensions via Accelerated, Parallel, and Proximal Coordinate Descent
A generic coordinate descent solver for non-smooth convex optimisation
Parallel coordinate descent methods for big data optimization