Performance of first-order methods for smooth convex minimization: a novel approach
Publication:2248759
Abstract: We introduce a novel approach for analyzing the performance of first-order black-box optimization methods. We focus on smooth unconstrained convex minimization over the Euclidean space \(\mathbb{R}^d\). Our approach relies on the observation that, by definition, the worst-case behavior of a black-box optimization method is itself an optimization problem, which we call the Performance Estimation Problem (PEP). We formulate and analyze the PEP for two classes of first-order algorithms. We first apply this approach to the classical gradient method and derive a new and tight analytical bound on its performance. We then consider a broader class of first-order black-box methods, which includes, among others, the so-called heavy-ball method and the fast gradient schemes. We show that for this broader class it is possible to derive new numerical bounds on the performance of these methods by solving an adequately relaxed convex semidefinite PEP. Finally, we present an efficient procedure for finding optimal step sizes, which results in a first-order black-box method that achieves the best performance.
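The semidefinite relaxation sketched in the abstract can be reproduced numerically. Below is a minimal sketch using the PEPit package (listed under "Cited in" below), which automates PEP constructions of this kind; the horizon N = 5, smoothness constant L = 1, and initial radius R = 1 are illustrative choices, not values taken from the paper.

```python
# A minimal sketch of a numerical Performance Estimation Problem (PEP),
# built with the PEPit package (see "PEPIT: computer-assisted worst-case
# analyses of first-order optimization methods in python" in the list below).
# N, L, and R are illustrative assumptions, not values from the paper.
from PEPit import PEP
from PEPit.functions import SmoothConvexFunction

L, N, R = 1.0, 5, 1.0

problem = PEP()

# Declare an L-smooth convex objective, known only through a first-order oracle.
func = problem.declare_function(SmoothConvexFunction, L=L)

# Optimal point and optimal value of the worst-case function.
xs = func.stationary_point()
fs = func(xs)

# Starting point, constrained to lie within distance R of the optimum.
x0 = problem.set_initial_point()
problem.set_initial_condition((x0 - xs) ** 2 <= R ** 2)

# Run N steps of the classical gradient method with step size 1/L.
x = x0
for _ in range(N):
    x = x - (1 / L) * func.gradient(x)

# Worst-case value of f(x_N) - f(x*) over all admissible functions and inputs.
problem.set_performance_metric(func(x) - fs)
tau = problem.solve()
print(tau)  # expected to be close to L * R**2 / (4 * N + 2)
```

For these values the returned worst-case bound should be close to \(LR^2/(4N+2) = 1/22 \approx 0.0455\), the tight analytical bound the paper derives for the gradient method with step size \(1/L\).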
Recommendations
- Smooth strongly convex interpolation and exact worst-case performance of first-order methods
- Optimizing first-order methods for smooth convex minimization of gradient Q-linearly convergence
- Optimizing the efficiency of first-order methods for decreasing the gradient of smooth convex functions
- Optimized first-order methods for smooth convex minimization
- Generalizing the optimized gradient method for smooth convex minimization
Cites work
- scientific article; zbMATH DE number 3850830 (title unavailable)
- scientific article; zbMATH DE number 3790208 (title unavailable)
- scientific article; zbMATH DE number 2107836 (title unavailable)
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- An Interior-Point Method for Semidefinite Programming
- Convex Optimization in Signal Processing and Communications
- Fine tuning Nesterov's steepest descent algorithm for differentiable convex programming
- Gradient-based algorithms with applications to signal-recovery problems
- Graph implementations for nonsmooth convex programs
- Improved algorithms for convex minimization in relative scale
- Introductory lectures on convex optimization. A basic course.
- Lectures on modern convex optimization. Analysis, algorithms, and engineering applications
- Optimizing properties of an inertial dynamical system with geometric damping. Link with proximal methods
- Primal-dual first-order methods with \({\mathcal {O}(1/\varepsilon)}\) iteration-complexity for cone programming
- Proximité et dualité dans un espace hilbertien
- Quadratic Matrix Programming
- Semidefinite Programming
- Some methods of speeding up the convergence of iteration methods
- The heavy ball with friction method. I: The continuous dynamical system: global exploration of the local minima of a real-valued function by asymptotic analysis of a dissipative dynamical system
- Variational Analysis
Cited in (82)
- Multiply accelerated value iteration for nonsymmetric affine fixed point problems and application to Markov decision processes
- Robust and structure exploiting optimisation algorithms: an integral quadratic constraint approach
- An optimal high-order tensor method for convex optimization
- Accelerated proximal algorithms with a correction term for monotone inclusions
- Potential Function-Based Framework for Minimizing Gradients in Convex and Min-Max Optimization
- Analysis and design of optimization algorithms via integral quadratic constraints
- Conditions for linear convergence of the gradient method for non-convex optimization
- Optimal deterministic algorithm generation
- Tight sublinear convergence rate of the proximal point algorithm for maximal monotone inclusion problems
- An elementary approach to tight worst case complexity analysis of gradient based methods
- Fast convergence of generalized forward-backward algorithms for structured monotone inclusions
- Provably faster gradient descent via long steps
- Inertial proximal alternating linearized minimization (iPALM) for nonconvex and nonsmooth problems
- Data-driven nonsmooth optimization
- A Systematic Approach to Lyapunov Analyses of Continuous-Time Models in Convex Optimization
- Another look at the fast iterative shrinkage/thresholding algorithm (FISTA)
- Generalizing the optimized gradient method for smooth convex minimization
- Operator splitting performance estimation: tight contraction factors and optimal parameter selection
- On the rate of convergence of the difference-of-convex algorithm (DCA)
- Tight ergodic sublinear convergence rate of the relaxed proximal point algorithm for monotone variational inequalities
- scientific article; zbMATH DE number 7370630 (title unavailable)
- The exact worst-case convergence rate of the alternating direction method of multipliers
- Accelerated additive Schwarz methods for convex optimization with adaptive restart
- Backward-forward-reflected-backward splitting for three operator monotone inclusions
- Conic linear optimization for computer-assisted proofs. Abstracts from the workshop held April 10--16, 2022
- On the oracle complexity of first-order and derivative-free algorithms for smooth nonconvex minimization
- Exact worst-case performance of first-order methods for composite convex optimization
- Optimal error bounds for non-expansive fixed-point iterations in normed spaces
- Proximal Splitting Algorithms for Convex Optimization: A Tour of Recent Advances, with New Twists
- Convergence rate of a relaxed inertial proximal algorithm for convex minimization
- Interpolation conditions for linear operators and applications to performance estimation problems
- Bounds for the tracking error of first-order online optimization methods
- PEPIT: computer-assisted worst-case analyses of first-order optimization methods in python
- Finding the forward-Douglas-Rachford-forward method
- Optimized first-order methods for smooth convex minimization
- Efficient first-order methods for convex minimization: a constructive approach
- A frequency-domain analysis of inexact gradient methods
- Optimal complexity and certification of Bregman first-order methods
- Two new splitting methods for three-operator monotone inclusions in Hilbert spaces
- Smooth strongly convex interpolation and exact worst-case performance of first-order methods
- Synthesis of accelerated gradient algorithms for optimization and saddle point problems using Lyapunov functions and LMIs
- A stochastic subspace approach to gradient-free optimization in high dimensions
- New analysis of linear convergence of gradient-type methods via unifying error bound conditions
- Factor-\(\sqrt{2}\) acceleration of accelerated gradient methods
- Performance of first- and second-order methods for \(\ell_1\)-regularized least squares problems
- Analysis of optimization algorithms via integral quadratic constraints: nonstrongly convex problems
- Worst-case convergence analysis of inexact gradient and Newton methods through semidefinite programming performance estimation
- Adaptive restart of the optimized gradient method for convex optimization
- Exact worst-case convergence rates of the proximal gradient method for composite convex minimization
- On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions
- Analysis of optimization algorithms via sum-of-squares
- Optimizing the efficiency of first-order methods for the distance to an optimal solution of smooth convex functions
- Optimizing the efficiency of first-order methods for decreasing the gradient of smooth convex functions
- Optimizing first-order methods for smooth convex minimization of gradient Q-linearly convergence
- An optimal variant of Kelley's cutting-plane method
- The exact worst-case convergence rate of the gradient method with fixed step lengths for \(L\)-smooth functions
- iPiasco: inertial proximal algorithm for strongly convex optimization
- Convergence rate analysis of the gradient descent–ascent method for convex–concave saddle-point problems
- Accelerated forward–backward algorithms for structured monotone inclusions
- Accelerated methods for saddle-point problem
- An optimal gradient method for smooth strongly convex minimization
- Automated tight Lyapunov analysis for first-order methods
- The exact information-based complexity of smooth convex minimization
- On the convergence analysis of the optimized gradient method
- Gradient descent technology for sparse vector learning in ontology algorithms
- Several kinds of acceleration techniques for unconstrained optimization first-order algorithms
- Fast proximal algorithms for nonsmooth convex optimization
- Fast gradient methods for uniformly convex and weakly smooth problems
- Subsampled Hessian Newton Methods for Supervised Learning
- Solving inverse problems using data-driven models
- Regularized nonlinear acceleration
- Generalized proximal point algorithms with correction terms and extrapolation
- Rate of convergence of inertial gradient dynamics with time-dependent viscous damping coefficient
- New computational guarantees for solving convex optimization problems with first order methods, via a function growth condition measure
- Rate of convergence of the Nesterov accelerated gradient method in the subcritical case α ≤ 3
- Cyclic schemes for PDE-based image analysis
- scientific article; zbMATH DE number 7363383 (title unavailable)
- Analysis of biased stochastic gradient descent using sequential semidefinite programs
- Accelerated proximal point method for maximally monotone operators
- Branch-and-bound performance estimation programming: a unified methodology for constructing optimal optimization methods
- Optimal step length for the maximal decrease of a self-concordant function by the Newton method
- Principled analyses and design of first-order methods with inexact proximal operators