On optimal probabilities in stochastic coordinate descent methods
From MaRDI portal
Publication: 315487
DOI: 10.1007/s11590-015-0916-1
zbMath: 1353.90148
arXiv: 1310.3438
OpenAlex: W2164075197
Wikidata: Q59474745
Scholia: Q59474745
MaRDI QID: Q315487
Publication date: 21 September 2016
Published in: Optimization Letters
Full work available at URL: https://arxiv.org/abs/1310.3438
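The publication concerns randomized coordinate descent in which the coordinate to update is sampled with non-uniform probabilities. A minimal sketch of this setting on a strongly convex quadratic, assuming the classical importance-sampling choice of probabilities proportional to the coordinate-wise Lipschitz constants (the function, matrix, and step sizes below are illustrative, not taken from the paper):

```python
import numpy as np

# Randomized coordinate descent with non-uniform (importance) sampling.
# Illustrative setup: minimize f(x) = 0.5 x^T A x - b^T x for a
# positive-definite A; the coordinate-wise Lipschitz constants are
# L_i = A_ii, and we sample coordinate i with probability p_i ∝ L_i.

rng = np.random.default_rng(0)

n = 20
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)        # positive definite => strongly convex f
b = rng.standard_normal(n)

L = np.diag(A).copy()              # coordinate-wise Lipschitz constants
p = L / L.sum()                    # importance-sampling probabilities

x = np.zeros(n)
for _ in range(5000):
    i = rng.choice(n, p=p)         # pick coordinate i with probability p_i
    g_i = A[i] @ x - b[i]          # partial derivative df/dx_i
    x[i] -= g_i / L[i]             # exact coordinate-wise gradient step

x_star = np.linalg.solve(A, b)     # closed-form minimizer for comparison
print(np.linalg.norm(x - x_star))  # residual shrinks with more iterations
```

With probabilities proportional to `L_i`, expensive (large-curvature) coordinates are visited more often, which is the kind of non-uniform sampling whose optimality the paper analyzes.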
Related Items (20)
- Parallel random block-coordinate forward-backward algorithm: a unified convergence analysis
- Accelerated, Parallel, and Proximal Coordinate Descent
- Distributed Block Coordinate Descent for Minimizing Partially Separable Functions
- Adaptive coordinate sampling for stochastic primal–dual optimization
- Unnamed Item
- Stochastic Reformulations of Linear Systems: Algorithms and Convergence Theory
- Batched Stochastic Gradient Descent with Weighted Sampling
- Faster convergence of a randomized coordinate descent method for linearly constrained optimization problems
- Stochastic quasi-Fejér block-coordinate fixed point iterations with random sweeping. II: Mean-square and linear convergence
- On the complexity of parallel coordinate descent
- Unnamed Item
- Stochastic Primal-Dual Hybrid Gradient Algorithm with Arbitrary Sampling and Imaging Applications
- Distributed Learning with Sparse Communications by Identification
- Fastest rates for stochastic mirror descent methods
- Coordinate descent with arbitrary sampling I: algorithms and complexity†
- Optimization in High Dimensions via Accelerated, Parallel, and Proximal Coordinate Descent
- High-performance statistical computing in the computing environments of the 2020s
- A generic coordinate descent solver for non-smooth convex optimisation
- Proximal Gradient Methods with Adaptive Subspace Sampling
- Parallel coordinate descent methods for big data optimization
Cites Work
- Unnamed Item
- Parallel coordinate descent methods for big data optimization
- Inexact coordinate descent: complexity and preconditioning
- On the complexity analysis of randomized block-coordinate descent methods
- Smooth minimization of nonsmooth functions with parallel coordinate descent methods
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Distributed Coordinate Descent Method for Learning with Big Data
- Coordinate descent with arbitrary sampling II: expected separable overapproximation
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Separable approximations and decomposition methods for the augmented Lagrangian
- Stochastic Block Mirror Descent Methods for Nonsmooth and Stochastic Optimization
- Randomized Iterative Methods for Linear Systems
- Efficient Serial and Parallel Coordinate Descent Methods for Huge-Scale Truss Topology Design
- Random Coordinate Descent Algorithms for Multi-Agent Convex Optimization Over Networks
- Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
- Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization