Pages that link to "Item:Q2452370"
From MaRDI portal
The following pages link to Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function (Q2452370):
Displaying 50 items.
- Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems (Q2133414) (← links)
- Parallel random block-coordinate forward-backward algorithm: a unified convergence analysis (Q2133415) (← links)
- Non-convex low-rank representation combined with rank-one matrix sum for subspace clustering (Q2156581) (← links)
- Multi-label optimal margin distribution machine (Q2183598) (← links)
- Synchronous parallel block coordinate descent method for nonsmooth convex function minimization (Q2200102) (← links)
- Worst-case complexity of cyclic coordinate descent: \(O(n^2)\) gap with randomized version (Q2220668) (← links)
- Fully asynchronous stochastic coordinate descent: a tight lower bound on the parallelism achieving linear speedup (Q2235160) (← links)
- An efficient Hessian based algorithm for solving large-scale sparse group Lasso problems (Q2288192) (← links)
- Restarting the accelerated coordinate descent method with a rough strong convexity estimate (Q2301128) (← links)
- Variant of greedy randomized Kaczmarz for ridge regression (Q2311776) (← links)
- Randomized primal-dual proximal block coordinate updates (Q2314059) (← links)
- Randomized and fault-tolerant method of subspace corrections (Q2319625) (← links)
- Convergence analysis for Kaczmarz-type methods in a Hilbert space framework (Q2347471) (← links)
- Coordinate descent algorithms (Q2349114) (← links)
- Parallel block coordinate minimization with application to group regularized regression (Q2358088) (← links)
- Performance of first- and second-order methods for \(\ell_1\)-regularized least squares problems (Q2374363) (← links)
- Random block coordinate descent methods for linearly constrained optimization over networks (Q2401516) (← links)
- A block symmetric Gauss-Seidel decomposition theorem for convex composite quadratic programming and its applications (Q2414911) (← links)
- Accelerating block coordinate descent methods with identification strategies (Q2419524) (← links)
- A random block-coordinate Douglas-Rachford splitting method with low computational complexity for binary logistic regression (Q2419533) (← links)
- On the convergence of asynchronous parallel iteration with unbounded delays (Q2422607) (← links)
- First-order methods almost always avoid strict saddle points (Q2425175) (← links)
- A parallel line search subspace correction method for composite convex optimization (Q2516372) (← links)
- Distributed constraint-coupled optimization via primal decomposition over random time-varying graphs (Q2665379) (← links)
- Poisoning finite-horizon Markov decision processes at design time (Q2668608) (← links)
- On multi-step greedy randomized coordinate descent method for solving large linear least-squares problems (Q2686517) (← links)
- A Fast Active Set Block Coordinate Descent Algorithm for $\ell_1$-Regularized Least Squares (Q2796799) (← links)
- Computational Methods for Solving Nonconvex Block-Separable Constrained Quadratic Problems (Q2810546) (← links)
- A second-order method for convex $\ell_1$-regularized optimization with active-set prediction (Q2815550) (← links)
- Coordinate descent with arbitrary sampling I: algorithms and complexity (Q2829565) (← links)
- Coordinate descent with arbitrary sampling II: expected separable overapproximation (Q2829566) (← links)
- Optimization in High Dimensions via Accelerated, Parallel, and Proximal Coordinate Descent (Q2832112) (← links)
- Separable approximations and decomposition methods for the augmented Lagrangian (Q2943840) (← links)
- Block Stochastic Gradient Iteration for Convex and Nonconvex Optimization (Q2945126) (← links)
- Asynchronous Stochastic Coordinate Descent: Parallelism and Convergence Properties (Q2954387) (← links)
- Stochastic Block Mirror Descent Methods for Nonsmooth and Stochastic Optimization (Q2954396) (← links)
- A Randomized Coordinate Descent Method with Volume Sampling (Q3300772) (← links)
- A Block Successive Upper-Bound Minimization Method of Multipliers for Linearly Constrained Convex Optimization (Q3387904) (← links)
- Iterative Proportional Scaling Revisited: A Modern Optimization Perspective (Q3391182) (← links)
- Accelerated, Parallel, and Proximal Coordinate Descent (Q3449571) (← links)
- The Cyclic Block Conditional Gradient Method for Convex Optimization Problems (Q3449572) (← links)
- An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization (Q3451763) (← links)
- Convergence Properties of the Randomized Extended Gauss-Seidel and Kaczmarz Methods (Q3456873) (← links)
- Distributed Block Coordinate Descent for Minimizing Partially Separable Functions (Q3462314) (← links)
- Parallel Random Coordinate Descent Method for Composite Minimization: Convergence Analysis and Error Bounds (Q3465244) (← links)
- (Q4558169) (← links)
- On Faster Convergence of Cyclic Block Coordinate Descent-type Methods for Strongly Convex Minimization (Q4558510) (← links)
- Catalyst Acceleration for First-order Convex Optimization: from Theory to Practice (Q4558545) (← links)
- A Randomized Nonmonotone Block Proximal Gradient Method for a Class of Structured Nonlinear Programming (Q4596724) (← links)
- Avoiding Communication in Primal and Dual Block Coordinate Descent Methods (Q4613501) (← links)