An optimal gradient method for smooth strongly convex minimization
DOI: 10.1007/s10107-022-01839-y | zbMATH Open: 1518.90071 | arXiv: 2101.09741 | MaRDI QID: Q6038652
Authors: Adrien B. Taylor, Yoel Drori
Publication date: 2 May 2023
Published in: Mathematical Programming. Series A. Series B
Abstract: We present an optimal gradient method for smooth strongly convex optimization. The method is optimal in the sense that its worst-case bound on the distance to an optimal point exactly matches the lower bound on the oracle complexity for the class of problems, meaning that no black-box first-order method can have a better worst-case guarantee without further assumptions on the class of problems at hand. In addition, we provide a constructive recipe for obtaining the algorithmic parameters of the method and illustrate that it can be used for deriving methods for other optimality criteria as well.
Full work available at URL: https://arxiv.org/abs/2101.09741
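As context for the abstract's claim, the problem class is minimization of an L-smooth, μ-strongly convex function with a black-box first-order oracle, where worst-case performance is measured by the distance to the optimal point after a fixed number of gradient calls. The sketch below is NOT the paper's optimal method; it is a standard baseline for this class (Nesterov's accelerated gradient scheme with constant momentum), shown only to illustrate the setting the paper's method is optimal in. All names and the test quadratic are illustrative assumptions.

```python
import numpy as np

def nesterov_strongly_convex(grad, x0, L, mu, n_iters):
    """Baseline accelerated gradient method for L-smooth, mu-strongly convex f.
    Updates: x_{k+1} = y_k - (1/L) grad(y_k),
             y_{k+1} = x_{k+1} + beta (x_{k+1} - x_k),
    with beta = (sqrt(kappa) - 1) / (sqrt(kappa) + 1), kappa = L / mu."""
    kappa = L / mu
    beta = (np.sqrt(kappa) - 1.0) / (np.sqrt(kappa) + 1.0)
    x_prev = x0.copy()
    y = x0.copy()
    for _ in range(n_iters):
        x = y - grad(y) / L          # gradient step at the extrapolated point
        y = x + beta * (x - x_prev)  # constant-momentum extrapolation
        x_prev = x
    return x_prev

# Illustrative test problem: quadratic f(x) = 0.5 x^T D x with D = diag(1, 10, 100),
# so mu = 1, L = 100, and the minimizer is x* = 0.
d = np.array([1.0, 10.0, 100.0])
grad = lambda x: d * x
x = nesterov_strongly_convex(grad, np.ones(3), L=100.0, mu=1.0, n_iters=200)
print(np.linalg.norm(x))  # distance to the optimum shrinks linearly with n_iters
```

The paper's contribution is a method whose worst-case guarantee on exactly this distance criterion matches the known lower bound for the class, so no black-box first-order method (including the baseline above) can do better in the worst case.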
Recommendations
- Generalizing the optimized gradient method for smooth convex minimization
- On the convergence analysis of the optimized gradient method
- Universal gradient methods for convex optimization problems
- Optimized first-order methods for smooth convex minimization
- Optimizing the efficiency of first-order methods for decreasing the gradient of smooth convex functions
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Introductory lectures on convex optimization. A basic course.
- Analysis and design of optimization algorithms via integral quadratic constraints
- Performance of first-order methods for smooth convex minimization: a novel approach
- Optimized first-order methods for smooth convex minimization
- Smooth strongly convex interpolation and exact worst-case performance of first-order methods
- Efficient first-order methods for convex minimization: a constructive approach
- Exact worst-case performance of first-order methods for composite convex optimization
- Universal method for stochastic composite optimization problems
- Information-based complexity of linear operator equations
- On the oracle complexity of smooth strongly convex minimization
- The exact information-based complexity of smooth convex minimization
- On the convergence analysis of the optimized gradient method
- Optimizing the efficiency of first-order methods for decreasing the gradient of smooth convex functions
- On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions
- Potential-function proofs for gradient methods
Cited In (33)
- Dynamic smoothness parameter for fast gradient methods
- Explicit stabilised gradient descent for faster strongly convex optimisation
- Potential Function-Based Framework for Minimizing Gradients in Convex and Min-Max Optimization
- An elementary approach to tight worst case complexity analysis of gradient based methods
- Optimal methods of smooth convex minimization
- Optimal Affine-Invariant Smooth Minimization Algorithms
- On the oracle complexity of smooth strongly convex minimization
- On lower and upper bounds in smooth and strongly convex optimization
- Provably faster gradient descent via long steps
- Relatively smooth convex optimization by first-order methods, and applications
- Oracle complexity of second-order methods for smooth convex optimization
- Generalizing the optimized gradient method for smooth convex minimization
- Conic linear optimization for computer-assisted proofs. Abstracts from the workshop held April 10--16, 2022
- Interpolation conditions for linear operators and applications to performance estimation problems
- Robust accelerated gradient methods for smooth strongly convex functions
- PEPIT: computer-assisted worst-case analyses of first-order optimization methods in python
- Nearly optimal first-order methods for convex optimization under gradient norm measure: an adaptive regularization approach
- An optimal first order method based on optimal quadratic averaging
- A recovered gradient method applied to smooth optimal shape problems
- New results on subgradient methods for strongly convex optimization problems with a unified analysis
- Efficient first-order methods for convex minimization: a constructive approach
- Optimal complexity and certification of Bregman first-order methods
- A first order method for finding minimal norm-like solutions of convex optimization problems
- Factor-\(\sqrt{2}\) acceleration of accelerated gradient methods
- An acceleration procedure for optimal first-order methods
- Fast and safe: accelerated gradient methods with optimality certificates and underestimate sequences
- On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions
- Gradient methods with memory
- Smooth Optimization with Approximate Gradient
- Accelerated minimax algorithms flock together
- Fast gradient methods for uniformly convex and weakly smooth problems
- Branch-and-bound performance estimation programming: a unified methodology for constructing optimal optimization methods