Pages that link to "Item:Q2954387"
From MaRDI portal
The following pages link to Asynchronous Stochastic Coordinate Descent: Parallelism and Convergence Properties (Q2954387):
Displaying 24 items.
- The restricted strong convexity revisited: analysis of equivalence to error bound and quadratic growth (Q523179)
- On unbounded delays in asynchronous parallel fixed-point algorithms (Q1668724)
- Stochastic block-coordinate gradient projection algorithms for submodular maximization (Q1723100)
- Asynchronous parallel primal-dual block coordinate update methods for affinely constrained convex programs (Q1734768)
- Synchronous parallel block coordinate descent method for nonsmooth convex function minimization (Q2200102)
- Asynchronous parallel algorithms for nonconvex optimization (Q2205974)
- Fully asynchronous stochastic coordinate descent: a tight lower bound on the parallelism achieving linear speedup (Q2235160)
- New analysis of linear convergence of gradient-type methods via unifying error bound conditions (Q2297652)
- Markov chain block coordinate descent (Q2301127)
- An inertial parallel and asynchronous forward-backward iteration for distributed convex optimization (Q2322371)
- Coordinate descent algorithms (Q2349114)
- Parallel block coordinate minimization with application to group regularized regression (Q2358088)
- Random block coordinate descent methods for linearly constrained optimization over networks (Q2401516)
- On the convergence of asynchronous parallel iteration with unbounded delays (Q2422607)
- Zeroth-order feedback optimization for cooperative multi-agent systems (Q2682294)
- ARock: An Algorithmic Framework for Asynchronous Parallel Coordinate Updates (Q2821779)
- Linear Convergence of Descent Methods for the Unconstrained Minimization of Restricted Strongly Convex Functions (Q2821800)
- Coordinate descent with arbitrary sampling I: algorithms and complexity (Q2829565)
- Coordinate descent with arbitrary sampling II: expected separable overapproximation (Q2829566)
- Distributed Stochastic Optimization with Large Delays (Q5868949)
- Parameter estimation in a 3-parameter *p*-star random graph model (Q6087121)
- A new large-scale learning algorithm for generalized additive models (Q6134363)
- On the Global Convergence of Randomized Coordinate Gradient Descent for Nonconvex Optimization (Q6158001)
- Convergence of an asynchronous block-coordinate forward-backward algorithm for convex composite optimization (Q6166657)