Performance of first-order methods for smooth convex minimization: a novel approach
From MaRDI portal
Publication:2248759
Abstract: We introduce a novel approach for analyzing the performance of first-order black-box optimization methods. We focus on smooth unconstrained convex minimization over the Euclidean space \(\mathbb{R}^d\). Our approach relies on the observation that, by definition, the worst-case behavior of a black-box optimization method is itself an optimization problem, which we call the Performance Estimation Problem (PEP). We formulate and analyze the PEP for two classes of first-order algorithms. We first apply this approach to the classical gradient method and derive a new and tight analytical bound on its performance. We then consider a broader class of first-order black-box methods, which, among others, includes the so-called heavy-ball method and the fast gradient schemes. We show that for this broader class it is possible to derive new numerical bounds on the performance of these methods by solving an adequately relaxed convex semidefinite PEP. Finally, we show an efficient procedure for finding optimal step sizes, which results in a first-order black-box method that achieves the best performance.
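The PEP methodology summarized in this abstract can now be reproduced numerically with off-the-shelf tools; the citing works below include the PEPit Python package, which automates exactly this kind of worst-case analysis. The following is a minimal sketch, assuming PEPit's documented interface, that computes a numerical worst-case bound for the fixed-step gradient method; for step size \(1/L\) and \(\|x_0 - x^*\| \le 1\), the paper's tight analytical bound on \(f(x_N) - f(x^*)\) is \(L/(4N+2)\).

```python
# Minimal PEP sketch using the PEPit package (interface assumed from its
# documentation): compute a numerical worst-case bound on f(x_N) - f(x*)
# for the gradient method with step size 1/L on L-smooth convex functions.
from PEPit import PEP
from PEPit.functions import SmoothConvexFunction

L, N = 1.0, 5                        # smoothness constant, iteration count

problem = PEP()
f = problem.declare_function(SmoothConvexFunction, L=L)
xs = f.stationary_point()            # an abstract minimizer x*
x0 = problem.set_initial_point()     # an abstract starting point x0
problem.set_initial_condition((x0 - xs) ** 2 <= 1)  # ||x0 - x*||^2 <= 1

x = x0
for _ in range(N):                   # the fixed-step gradient method
    x = x - (1.0 / L) * f.gradient(x)

problem.set_performance_metric(f(x) - f(xs))
tau = problem.solve()                # solves the SDP relaxation numerically

print(tau, L / (4 * N + 2))         # tau should match the tight bound
```

Internally, such a tool builds the same kind of semidefinite relaxation described in the abstract and hands it to a numerical SDP solver, so the returned value can be compared directly with the analytical bound.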
Recommendations
- Smooth strongly convex interpolation and exact worst-case performance of first-order methods
- Optimizing first-order methods for smooth convex minimization of gradient Q-linearly convergence
- Optimizing the efficiency of first-order methods for decreasing the gradient of smooth convex functions
- Optimized first-order methods for smooth convex minimization
- Generalizing the optimized gradient method for smooth convex minimization
Cites work
- scientific article; zbMATH DE number 3850830
- scientific article; zbMATH DE number 3790208
- scientific article; zbMATH DE number 2107836
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- An Interior-Point Method for Semidefinite Programming
- Convex Optimization in Signal Processing and Communications
- Fine tuning Nesterov's steepest descent algorithm for differentiable convex programming
- Gradient-based algorithms with applications to signal-recovery problems
- Graph implementations for nonsmooth convex programs
- Improved algorithms for convex minimization in relative scale
- Introductory lectures on convex optimization. A basic course.
- Lectures on modern convex optimization. Analysis, algorithms, and engineering applications
- Optimizing properties of an inertial dynamical system with geometric damping. Link with proximal methods
- Primal-dual first-order methods with \({\mathcal {O}(1/\varepsilon)}\) iteration-complexity for cone programming
- Proximité et dualité dans un espace hilbertien
- Quadratic Matrix Programming
- Semidefinite Programming
- Some methods of speeding up the convergence of iteration methods
- The heavy ball with friction method, I. The continuous dynamical system: global exploration of the local minima of a real-valued function by asymptotic analysis of a dissipative dynamical system
- Variational Analysis
Cited in (82)
- Subsampled Hessian Newton Methods for Supervised Learning
- On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions
- Gradient descent technology for sparse vector learning in ontology algorithms
- Worst-case convergence analysis of inexact gradient and Newton methods through semidefinite programming performance estimation
- Data-driven nonsmooth optimization
- Conditions for linear convergence of the gradient method for non-convex optimization
- Fast proximal algorithms for nonsmooth convex optimization
- Another look at the fast iterative shrinkage/thresholding algorithm (FISTA)
- A Systematic Approach to Lyapunov Analyses of Continuous-Time Models in Convex Optimization
- Cyclic schemes for PDE-based image analysis
- Accelerated methods for saddle-point problem
- Inertial proximal alternating linearized minimization (iPALM) for nonconvex and nonsmooth problems
- Generalizing the optimized gradient method for smooth convex minimization
- iPiasco: inertial proximal algorithm for strongly convex optimization
- A stochastic subspace approach to gradient-free optimization in high dimensions
- An optimal variant of Kelley's cutting-plane method
- Fast gradient methods for uniformly convex and weakly smooth problems
- Fast convergence of generalized forward-backward algorithms for structured monotone inclusions
- Accelerated proximal algorithms with a correction term for monotone inclusions
- Regularized nonlinear acceleration
- Optimizing the efficiency of first-order methods for the distance to an optimal solution of smooth convex functions
- Smooth strongly convex interpolation and exact worst-case performance of first-order methods
- Optimal error bounds for non-expansive fixed-point iterations in normed spaces
- Exact worst-case performance of first-order methods for composite convex optimization
- Proximal Splitting Algorithms for Convex Optimization: A Tour of Recent Advances, with New Twists
- Analysis and design of optimization algorithms via integral quadratic constraints
- Accelerated additive Schwarz methods for convex optimization with adaptive restart
- Analysis of biased stochastic gradient descent using sequential semidefinite programs
- Bounds for the tracking error of first-order online optimization methods
- On the oracle complexity of first-order and derivative-free algorithms for smooth nonconvex minimization
- Rate of convergence of the Nesterov accelerated gradient method in the subcritical case α ≤ 3
- Operator splitting performance estimation: tight contraction factors and optimal parameter selection
- Synthesis of accelerated gradient algorithms for optimization and saddle point problems using Lyapunov functions and LMIs
- Solving inverse problems using data-driven models
- Tight sublinear convergence rate of the proximal point algorithm for maximal monotone inclusion problems
- Backward-forward-reflected-backward splitting for three operator monotone inclusions
- New analysis of linear convergence of gradient-type methods via unifying error bound conditions
- Finding the forward-Douglas-Rachford-forward method
- Optimal deterministic algorithm generation
- Convergence rate of a relaxed inertial proximal algorithm for convex minimization
- Accelerated proximal point method for maximally monotone operators
- Optimized first-order methods for smooth convex minimization
- Rate of convergence of inertial gradient dynamics with time-dependent viscous damping coefficient
- scientific article; zbMATH DE number 7363383
- Robust and structure exploiting optimisation algorithms: an integral quadratic constraint approach
- Performance of first- and second-order methods for \(\ell_1\)-regularized least squares problems
- New computational guarantees for solving convex optimization problems with first order methods, via a function growth condition measure
- An optimal gradient method for smooth strongly convex minimization
- Analysis of optimization algorithms via integral quadratic constraints: nonstrongly convex problems
- The exact worst-case convergence rate of the gradient method with fixed step lengths for \(L\)-smooth functions
- Efficient first-order methods for convex minimization: a constructive approach
- The exact information-based complexity of smooth convex minimization
- On the convergence analysis of the optimized gradient method
- Adaptive restart of the optimized gradient method for convex optimization
- Exact worst-case convergence rates of the proximal gradient method for composite convex minimization
- A frequency-domain analysis of inexact gradient methods
- Optimal complexity and certification of Bregman first-order methods
- Optimizing the efficiency of first-order methods for decreasing the gradient of smooth convex functions
- An optimal high-order tensor method for convex optimization
- Analysis of optimization algorithms via sum-of-squares
- scientific article; zbMATH DE number 7370630
- Two new splitting methods for three-operator monotone inclusions in Hilbert spaces
- Accelerated forward–backward algorithms for structured monotone inclusions
- Interpolation conditions for linear operators and applications to performance estimation problems
- Principled analyses and design of first-order methods with inexact proximal operators
- Provably faster gradient descent via long steps
- Convergence rate analysis of the gradient descent–ascent method for convex–concave saddle-point problems
- The exact worst-case convergence rate of the alternating direction method of multipliers
- Multiply accelerated value iteration for nonsymmetric affine fixed point problems and application to Markov decision processes
- Generalized proximal point algorithms with correction terms and extrapolation
- On the rate of convergence of the difference-of-convex algorithm (DCA)
- Tight ergodic sublinear convergence rate of the relaxed proximal point algorithm for monotone variational inequalities
- PEPit: computer-assisted worst-case analyses of first-order optimization methods in Python
- Conic linear optimization for computer-assisted proofs. Abstracts from the workshop held April 10–16, 2022
- An elementary approach to tight worst case complexity analysis of gradient based methods
- Automated tight Lyapunov analysis for first-order methods
- Branch-and-bound performance estimation programming: a unified methodology for constructing optimal optimization methods
- Optimal step length for the maximal decrease of a self-concordant function by the Newton method
- Several kinds of acceleration techniques for unconstrained optimization first-order algorithms
- Optimizing first-order methods for smooth convex minimization of gradient Q-linearly convergence
- Potential Function-Based Framework for Minimizing Gradients in Convex and Min-Max Optimization
- Factor-\(\sqrt{2}\) acceleration of accelerated gradient methods