An optimal gradient method for smooth strongly convex minimization
Publication: Q6038652
DOI: 10.1007/s10107-022-01839-y
zbMATH Open: 1518.90071
arXiv: 2101.09741
MaRDI QID: Q6038652
Publication date: 2 May 2023
Published in: Mathematical Programming. Series A. Series B
Abstract: We present an optimal gradient method for smooth strongly convex optimization. The method is optimal in the sense that its worst-case bound on the distance to an optimal point exactly matches the lower bound on the oracle complexity for the class of problems, meaning that no black-box first-order method can have a better worst-case guarantee without further assumptions on the class of problems at hand. In addition, we provide a constructive recipe for obtaining the algorithmic parameters of the method and illustrate that it can be used for deriving methods for other optimality criteria as well.
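For orientation, below is a minimal sketch of a classical baseline for this problem class: Nesterov's accelerated gradient method with constant momentum for L-smooth, mu-strongly convex objectives. This is not the optimal method constructed in the paper, whose algorithmic parameters come from the constructive recipe mentioned in the abstract; the function and parameter names here are illustrative only.

```python
import numpy as np

def nesterov_strongly_convex(grad, x0, L, mu, n_iter):
    """Nesterov's accelerated gradient method with constant momentum for an
    L-smooth, mu-strongly convex objective. A classical baseline only; the
    paper's optimal method uses specially constructed parameters instead of
    the constant momentum coefficient used below."""
    kappa = L / mu                                      # condition number
    beta = (np.sqrt(kappa) - 1) / (np.sqrt(kappa) + 1)  # momentum coefficient
    x = np.asarray(x0, dtype=float)
    y = x.copy()                                        # extrapolated point
    for _ in range(n_iter):
        x_next = y - grad(y) / L                        # gradient step from y
        y = x_next + beta * (x_next - x)                # momentum extrapolation
        x = x_next
    return x

# Example on a quadratic with L = 10, mu = 1, whose minimizer is x* = 0.
A = np.diag([1.0, 10.0])
x_hat = nesterov_strongly_convex(lambda x: A @ x, np.ones(2),
                                 L=10.0, mu=1.0, n_iter=50)
```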
Full work available at URL: https://arxiv.org/abs/2101.09741
Recommendations
- Generalizing the optimized gradient method for smooth convex minimization
- On the convergence analysis of the optimized gradient method
- Universal gradient methods for convex optimization problems
- Optimized first-order methods for smooth convex minimization
- Optimizing the efficiency of first-order methods for decreasing the gradient of smooth convex functions
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Introductory lectures on convex optimization. A basic course.
- Title not available
- Analysis and Design of Optimization Algorithms via Integral Quadratic Constraints
- Title not available
- Performance of first-order methods for smooth convex minimization: a novel approach
- Optimized first-order methods for smooth convex minimization
- Smooth strongly convex interpolation and exact worst-case performance of first-order methods
- Efficient first-order methods for convex minimization: a constructive approach
- Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization
- Universal method for stochastic composite optimization problems
- Information-based complexity of linear operator equations
- On the oracle complexity of smooth strongly convex minimization
- The exact information-based complexity of smooth convex minimization
- On the convergence analysis of the optimized gradient method
- Optimizing the efficiency of first-order methods for decreasing the gradient of smooth convex functions
- On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions
- Title not available
- Title not available
Cited In (14)
- Potential Function-Based Framework for Minimizing Gradients in Convex and Min-Max Optimization
- Title not available
- An elementary approach to tight worst case complexity analysis of gradient based methods
- Optimal methods of smooth convex minimization
- Optimal Affine-Invariant Smooth Minimization Algorithms
- On the oracle complexity of smooth strongly convex minimization
- Provably faster gradient descent via long steps
- Conic linear optimization for computer-assisted proofs. Abstracts from the workshop held April 10--16, 2022
- Interpolation conditions for linear operators and applications to performance estimation problems
- PEPit: computer-assisted worst-case analyses of first-order optimization methods in Python
- A recovered gradient method applied to smooth optimal shape problems
- Factor-\(\sqrt{2}\) acceleration of accelerated gradient methods
- Accelerated minimax algorithms flock together
- Branch-and-bound performance estimation programming: a unified methodology for constructing optimal optimization methods
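One of the citing works above, PEPit, is a Python package implementing the performance-estimation methodology that underlies worst-case results of the kind this paper establishes. As a hedged illustration of that workflow (assuming the API as documented for the PEPit package; the step size, horizon, and smoothness constants below are illustrative), one can compute the exact worst-case distance to the minimizer for plain gradient descent on the same problem class:

```python
# pip install pepit
from PEPit import PEP
from PEPit.functions import SmoothStronglyConvexFunction

L, mu, n = 1.0, 0.1, 5
gamma = 2.0 / (L + mu)                    # classical step size for gradient descent

problem = PEP()
f = problem.declare_function(SmoothStronglyConvexFunction, mu=mu, L=L)
xs = f.stationary_point()                 # the (implicit) minimizer x*
x0 = problem.set_initial_point()
problem.set_initial_condition((x0 - xs) ** 2 <= 1)

x = x0
for _ in range(n):
    x = x - gamma * f.gradient(x)         # plain gradient descent

problem.set_performance_metric((x - xs) ** 2)
tau = problem.solve()                     # exact worst-case ||x_n - x*||^2
# For this step size the value matches ((kappa - 1) / (kappa + 1)) ** (2 * n),
# with kappa = L / mu.
print(tau)
```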