Pages that link to "Item:Q3451763"
From MaRDI portal
The following pages link to An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization (Q3451763):
Displaying 37 items.
- A penalty method for rank minimization problems in symmetric matrices (Q1616933) (← links)
- An optimal randomized incremental gradient method (Q1785198) (← links)
- Generalized stochastic Frank-Wolfe algorithm with stochastic "substitute" gradient for structured convex optimization (Q2020608) (← links)
- Acceleration of primal-dual methods by preconditioning and simple subproblem procedures (Q2027970) (← links)
- On maximum residual block and two-step Gauss-Seidel algorithms for linear least-squares problems (Q2041935) (← links)
- An adaptive Polyak heavy-ball method (Q2102380) (← links)
- Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems (Q2133414) (← links)
- Worst-case complexity of cyclic coordinate descent: \(O(n^2)\) gap with randomized version (Q2220668) (← links)
- Restarting the accelerated coordinate descent method with a rough strong convexity estimate (Q2301128) (← links)
- Coordinate descent algorithms (Q2349114) (← links)
- A parallel line search subspace correction method for composite convex optimization (Q2516372) (← links)
- An Efficient Inexact ABCD Method for Least Squares Semidefinite Programming (Q2805705) (← links)
- Coordinate descent with arbitrary sampling I: algorithms and complexity† (Q2829565) (← links)
- Coordinate descent with arbitrary sampling II: expected separable overapproximation (Q2829566) (← links)
- Optimization in High Dimensions via Accelerated, Parallel, and Proximal Coordinate Descent (Q2832112) (← links)
- MAGMA: Multilevel Accelerated Gradient Mirror Descent Algorithm for Large-Scale Convex Composite Minimization (Q3179624) (← links)
- A Randomized Coordinate Descent Method with Volume Sampling (Q3300772) (← links)
- Accelerated, Parallel, and Proximal Coordinate Descent (Q3449571) (← links)
- (Q4558169) (← links)
- Catalyst Acceleration for First-order Convex Optimization: from Theory to Practice (Q4558545) (← links)
- (Q4558559) (← links)
- (Q4633055) (← links)
- Stochastic Primal-Dual Coordinate Method for Regularized Empirical Risk Minimization (Q4636997) (← links)
- (Q4637040) (← links)
- A remark on accelerated block coordinate descent for computing the proximity operators of a sum of convex functions (Q4967326) (← links)
- On the Complexity Analysis of the Primal Solutions for the Accelerated Randomized Dual Coordinate Ascent (Q4969070) (← links)
- A Subspace Acceleration Method for Minimization Involving a Group Sparsity-Inducing Regularizer (Q5072590) (← links)
- Two Symmetrized Coordinate Descent Methods Can Be $O(n^2)$ Times Slower Than the Randomized Version (Q5162659) (← links)
- An Optimal Algorithm for Decentralized Finite-Sum Optimization (Q5162661) (← links)
- (Q5214258) (← links)
- CoordinateWise Descent Methods for Leading Eigenvalue Problem (Q5230665) (← links)
- Randomized Block Proximal Damped Newton Method for Composite Self-Concordant Minimization (Q5355205) (← links)
- An introduction to continuous optimization for imaging (Q5740077) (← links)
- Accelerated First-Order Methods for Convex Optimization with Locally Lipschitz Continuous Gradient (Q6046830) (← links)
- An aggressive reduction on the complexity of optimization for non-strongly convex objectives (Q6052286) (← links)
- Cyclic Coordinate Dual Averaging with Extrapolation (Q6060151) (← links)
- Iteration-Complexity of First-Order Augmented Lagrangian Methods for Convex Conic Programming (Q6161315) (← links)