Pages that link to "Item:Q5502117"
From MaRDI portal
The following pages link to An Asynchronous Parallel Stochastic Coordinate Descent Algorithm (Q5502117):
Displaying 49 items.
- On unbounded delays in asynchronous parallel fixed-point algorithms (Q1668724) (← links)
- Stochastic block-coordinate gradient projection algorithms for submodular maximization (Q1723100) (← links)
- Asynchronous parallel primal-dual block coordinate update methods for affinely constrained convex programs (Q1734768) (← links)
- Parallel decomposition methods for linearly constrained problems subject to simple bound with application to the SVMs training (Q1790674) (← links)
- A distributed quantile estimation algorithm of heavy-tailed distribution with massive datasets (Q1980051) (← links)
- Resolving learning rates adaptively by locating stochastic non-negative associated gradient projection points using line searches (Q2022225) (← links)
- Asynchronous parallel algorithms for nonconvex optimization (Q2205974) (← links)
- Worst-case complexity of cyclic coordinate descent: \(O(n^2)\) gap with randomized version (Q2220668) (← links)
- Fully asynchronous stochastic coordinate descent: a tight lower bound on the parallelism achieving linear speedup (Q2235160) (← links)
- Asynchronous Lagrangian scenario decomposition (Q2246185) (← links)
- Coordinate descent algorithms (Q2349114) (← links)
- Linear convergence of first order methods for non-strongly convex optimization (Q2414900) (← links)
- A framework for parallel second order incremental optimization algorithms for solving partially separable problems (Q2419531) (← links)
- On the convergence of asynchronous parallel iteration with unbounded delays (Q2422607) (← links)
- A linearly convergent doubly stochastic Gauss-Seidel algorithm for solving linear equations and a certain class of over-parameterized optimization problems (Q2425182) (← links)
- Convergence of the forward-backward algorithm: beyond the worst-case with the help of geometry (Q2687067) (← links)
- Variance reduction for root-finding problems (Q2689823) (← links)
- On the parallelization upper bound for asynchronous stochastic gradients descent in non-convex optimization (Q2696976) (← links)
- ARock: An Algorithmic Framework for Asynchronous Parallel Coordinate Updates (Q2821779) (← links)
- Optimization in High Dimensions via Accelerated, Parallel, and Proximal Coordinate Descent (Q2832112) (← links)
- Asynchronous Stochastic Coordinate Descent: Parallelism and Convergence Properties (Q2954387) (← links)
- Accelerated, Parallel, and Proximal Coordinate Descent (Q3449571) (← links)
- An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization (Q3451763) (← links)
- Distributed Block Coordinate Descent for Minimizing Partially Separable Functions (Q3462314) (← links)
- Perturbed Iterate Analysis for Asynchronous Stochastic Optimization (Q4588862) (← links)
- Improved asynchronous parallel optimization analysis for stochastic incremental methods (Q4614129) (← links)
- (Q4633026) (← links)
- Decentralized Consensus Algorithm with Delayed and Stochastic Gradients (Q4641647) (← links)
- Optimization Methods for Large-Scale Machine Learning (Q4641709) (← links)
- Distributed Learning with Sparse Communications by Identification (Q4959464) (← links)
- (Q4969198) (← links)
- (Q4969246) (← links)
- A class of smooth exact penalty function methods for optimization problems with orthogonality constraints (Q5058369) (← links)
- An asynchronous inertial algorithm for solving convex feasibility problems with strict pseudo-contractions in Hilbert spaces (Q5065443) (← links)
- On the rates of convergence of parallelized averaged stochastic gradient algorithms (Q5110810) (← links)
- Parallel Stochastic Asynchronous Coordinate Descent: Tight Bounds on the Possible Parallelism (Q5147036) (← links)
- Two Symmetrized Coordinate Descent Methods Can Be $O(n^2)$ Times Slower Than the Randomized Version (Q5162659) (← links)
- Parallelizable Algorithms for Optimization Problems with Orthogonality Constraints (Q5230625) (← links)
- Coordinatewise Descent Methods for Leading Eigenvalue Problem (Q5230665) (← links)
- Accelerate stochastic subgradient method by leveraging local growth condition (Q5236746) (← links)
- Randomized Block Proximal Damped Newton Method for Composite Self-Concordant Minimization (Q5355205) (← links)
- Redundancy Techniques for Straggler Mitigation in Distributed Optimization and Learning (Q5381126) (← links)
- A generic coordinate descent solver for non-smooth convex optimisation (Q5865339) (← links)
- Cyclic Coordinate Dual Averaging with Extrapolation (Q6060151) (← links)
- Parallel and distributed asynchronous adaptive stochastic gradient methods (Q6095736) (← links)
- On the Global Convergence of Randomized Coordinate Gradient Descent for Nonconvex Optimization (Q6158001) (← links)
- Convergence of an asynchronous block-coordinate forward-backward algorithm for convex composite optimization (Q6166657) (← links)
- Distributed adaptive greedy quasi-Newton methods with explicit non-asymptotic convergence bounds (Q6566734) (← links)
- Non-ergodic linear convergence property of the delayed gradient descent under the strongly convexity and the Polyak-Łojasiewicz condition (Q6587593) (← links)