Sharpness, Restart, and Acceleration
Publication: 5210521
DOI: 10.1137/18M1224568
zbMath: 1435.90109
arXiv: 1702.03828
MaRDI QID: Q5210521
Alexandre d'Aspremont, Vincent Roulet
Publication date: 21 January 2020
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1702.03828
Related Items (18)
- Additive Schwarz methods for convex optimization with backtracking
- Convergence results of a new monotone inertial forward-backward splitting algorithm under the local Hölder error bound condition
- WARPd: A Linearly Convergent First-Order Primal-Dual Algorithm for Inverse Problems with Approximate Sharpness Conditions
- Accelerated additive Schwarz methods for convex optimization with adaptive restart
- Fast gradient methods for uniformly convex and weakly smooth problems
- Practical perspectives on symplectic accelerated optimization
- Robust obstacle reconstruction in an elastic medium
- Accelerated sparse recovery via gradient descent with nonlinear conjugate gradient momentum
- Radial duality. II: Applications and algorithms
- A class of modified accelerated proximal gradient methods for nonsmooth and nonconvex minimization problems
- Parking on supercritical Galton-Watson trees
- Faster first-order primal-dual methods for linear programming using restarts and sharpness
- On optimal universal first-order methods for minimizing heterogeneous sums
- A simple nearly optimal restart scheme for speeding up first-order methods
- General Hölder smooth convergence rates follow from specialized rates assuming growth bounds
- NESTANets: stable, accurate and efficient neural networks for analysis-sparse inverse problems
- Nearly optimal first-order methods for convex optimization under gradient norm measure: an adaptive regularization approach
- Restarting Frank-Wolfe: faster rates under Hölderian error bounds
Cites Work
- Smooth minimization of non-smooth functions
- Gradient methods for minimizing composite functions
- Proximal alternating linearized minimization for nonconvex and nonsmooth problems
- First-order algorithm with \(\mathcal{O}(\ln(1/\epsilon))\) convergence for \(\epsilon\)-equilibrium in two-person zero-sum games
- Universal gradient methods for convex optimization problems
- Semianalytic and subanalytic sets
- On semi- and subanalytic geometry
- Introductory lectures on convex optimization. A basic course.
- New computational guarantees for solving convex optimization problems with first order methods, via a function growth condition measure
- From error bounds to the complexity of first-order descent methods for convex functions
- Restarting the accelerated coordinate descent method with a rough strong convexity estimate
- Splitting methods with variable metric for Kurdyka-Łojasiewicz functions and general convergence rates
- Adaptive restart for accelerated gradient schemes
- A simple nearly optimal restart scheme for speeding up first-order methods
- Smoothing and First Order Methods: A Unified Framework
- Weak Sharp Minima in Mathematical Programming
- Proximal Alternating Minimization and Projection Methods for Nonconvex Problems: An Approach Based on the Kurdyka-Łojasiewicz Inequality
- A Condition Number for Differentiable Convex Inequalities
- Optimal methods of smooth convex minimization
- Global Regularity Theorems
- An Application of Error Bounds for Convex Programming in a Linear Space
- Relatively Smooth Convex Optimization by First-Order Methods, and Applications
- Computational complexity versus statistical performance on sparse recovery problems
- Error Bounds, Quadratic Growth, and Linear Convergence of Proximal Methods
- Adaptive restart of accelerated gradient methods under local quadratic growth condition
- The Łojasiewicz Inequality for Nonsmooth Subanalytic Functions with Applications to Subgradient Dynamical Systems
- A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications