Restarting the accelerated coordinate descent method with a rough strong convexity estimate
DOI: 10.1007/s10589-019-00137-2
zbMath: 1432.90109
arXiv: 1803.05771
OpenAlex: W3100049367
Wikidata: Q127111570
MaRDI QID: Q2301128
Publication date: 28 February 2020
Published in: Computational Optimization and Applications
Full work available at URL: https://arxiv.org/abs/1803.05771
Keywords: accelerated coordinate descent, local quadratic error bound, restarting strategies, unknown strong convexity
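The record carries no abstract, so as a rough illustration of the restart idea named in the title and keywords: the sketch below restarts Nesterov's accelerated gradient method (full-gradient, for brevity, rather than the paper's coordinate descent) at a fixed period derived from a rough strong convexity estimate. The period K ≈ e·sqrt(L/mu_est), the function names, and the constants are all illustrative assumptions, not the authors' implementation.

    import numpy as np

    def restarted_accelerated_gradient(grad, x0, L, mu_est, n_iters=3000):
        """Accelerated gradient method with fixed-period restarts.

        mu_est is a rough estimate of the strong convexity constant; the
        momentum is reset every K ~ e * sqrt(L / mu_est) iterations
        (hypothetical schedule, for illustration only).
        """
        K = max(1, int(np.ceil(np.e * np.sqrt(L / mu_est))))  # restart period
        x = np.array(x0, dtype=float)
        y, t = x.copy(), 1.0
        for k in range(1, n_iters + 1):
            x_next = y - grad(y) / L                  # gradient step at the extrapolated point
            t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
            y = x_next + ((t - 1.0) / t_next) * (x_next - x)  # Nesterov extrapolation
            x, t = x_next, t_next
            if k % K == 0:                            # restart: drop accumulated momentum
                y, t = x.copy(), 1.0
        return x

    # Usage on a strongly convex quadratic f(x) = 0.5 x^T A x - b^T x,
    # with mu_est deliberately off from the true strong convexity constant.
    rng = np.random.default_rng(0)
    M = rng.standard_normal((50, 20))
    A = M.T @ M + 0.01 * np.eye(20)
    b = rng.standard_normal(20)
    L = np.linalg.eigvalsh(A).max()                   # gradient Lipschitz constant
    x_star = np.linalg.solve(A, b)
    x_hat = restarted_accelerated_gradient(lambda x: A @ x - b, np.zeros(20), L, mu_est=1e-3)
    print(np.linalg.norm(x_hat - x_star))

The point of the fixed-period schedule is that it only needs mu_est to be within a constant factor of the truth; the paper's contribution concerns exactly this robustness to a rough estimate, in the coordinate descent setting.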
Related Items
- Parallel random block-coordinate forward-backward algorithm: a unified convergence analysis
- On the Complexity Analysis of the Primal Solutions for the Accelerated Randomized Dual Coordinate Ascent
- Practical perspectives on symplectic accelerated optimization
- A class of modified accelerated proximal gradient methods for nonsmooth and nonconvex minimization problems
- Quadratic error bound of the smoothed gap and the restarted averaged primal-dual hybrid gradient
- First-order methods for convex optimization
- A Smooth Primal-Dual Optimization Framework for Nonsmooth Composite Convex Minimization
- Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions
- Sharpness, Restart, and Acceleration
- Restarting the accelerated coordinate descent method with a rough strong convexity estimate
- An inexact proximal augmented Lagrangian framework with arbitrary linearly convergent inner solver for composite convex optimization
- A piecewise conservative method for unconstrained convex optimization
- Generalized self-concordant functions: a recipe for Newton-type methods
- A dual approach for optimal algorithms in distributed optimization over networks
- A generic coordinate descent solver for non-smooth convex optimisation
- Universal intermediate gradient method for convex problems with inexact oracle
- Scaled, Inexact, and Adaptive Generalized FISTA for Strongly Convex Optimization
- Linear Convergence of Random Dual Coordinate Descent on Nonpolyhedral Convex Problems
- Restarting Frank-Wolfe: faster rates under Hölderian error bounds
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Parallel coordinate descent methods for big data optimization
- Gradient methods for minimizing composite functions
- Error bounds and convergence analysis of feasible descent methods: A general approach
- From error bounds to the complexity of first-order descent methods for convex functions
- Restarting the accelerated coordinate descent method with a rough strong convexity estimate
- Smooth minimization of nonsmooth functions with parallel coordinate descent methods
- An adaptive accelerated proximal gradient method and its homotopy continuation for sparse optimization
- Adaptive restart for accelerated gradient schemes
- Linearly convergent away-step conditional gradient for non-strongly convex functions
- Linear convergence of first order methods for non-strongly convex optimization
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Pathwise coordinate optimization
- Stochastic intermediate gradient method for convex optimization problems
- Coordinate descent with arbitrary sampling II: expected separable overapproximation
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Accelerated, Parallel, and Proximal Coordinate Descent
- An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
- Error bounds for solutions of linear equations and inequalities
- Katyusha: the first direct acceleration of stochastic gradient methods
- Error Bounds, Quadratic Growth, and Linear Convergence of Proximal Methods
- Adaptive restart of accelerated gradient methods under local quadratic growth condition
- Optimal Stochastic Approximation Algorithms for Strongly Convex Stochastic Composite Optimization, II: Shrinking Procedures and Optimal Algorithms