Worst-case convergence analysis of inexact gradient and Newton methods through semidefinite programming performance estimation
Publication:5116548
Abstract: We provide new tools for worst-case performance analysis of the gradient (or steepest descent) method of Cauchy for smooth strongly convex functions, and Newton's method for self-concordant functions, including the case of inexact search directions. The analysis uses semidefinite programming performance estimation, as pioneered by Drori and Teboulle [Mathematical Programming, 145(1-2):451-482, 2014], and extends recent performance estimation results for the method of Cauchy by the authors [Optimization Letters, 11(7):1185-1199, 2017]. To illustrate the applicability of the tools, we demonstrate a novel complexity analysis of short-step interior point methods using inexact search directions. As an example in this framework, we sketch how to give a rigorous worst-case complexity analysis of a recent interior point method by Abernethy and Hazan [PMLR, 48:2520-2528, 2016].
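To make the performance-estimation idea concrete, the following is a minimal sketch (not the paper's formulation, which additionally handles strong convexity, self-concordance, and inexact search directions) of a Drori-Teboulle-style SDP for a single gradient step with step size 1/L on an L-smooth convex function. The use of the cvxpy modelling library and the numerical values of L and R are assumptions made purely for illustration.

```python
import cvxpy as cp
import numpy as np

# Assumed illustration data (not from the paper): smoothness constant and initial radius.
L, R = 1.0, 1.0
gamma = 1.0 / L          # gradient step size

# Gram matrix of the vectors (x0 - x*, g0, g1); by convention g* = 0 and f* = 0.
G = cp.Variable((3, 3), PSD=True)
f = cp.Variable(2)       # f[0] = f(x0) - f*, f[1] = f(x1) - f*

# Coefficients of each iterate and gradient in the Gram basis.
x = {'0': np.array([1.0, 0.0, 0.0]),       # x0 - x*
     '1': np.array([1.0, -gamma, 0.0]),    # x1 - x* = (x0 - x*) - gamma * g0
     '*': np.zeros(3)}
g = {'0': np.array([0.0, 1.0, 0.0]),
     '1': np.array([0.0, 0.0, 1.0]),
     '*': np.zeros(3)}
fv = {'0': f[0], '1': f[1], '*': 0.0}

def ip(u, v):
    """Inner product <u, v> expressed through the Gram matrix G."""
    return cp.sum(cp.multiply(np.outer(u, v), G))

# Interpolation conditions for L-smooth convex functions (mu = 0 case):
# f_i >= f_j + <g_j, x_i - x_j> + (1/(2L)) * ||g_i - g_j||^2 for all i != j.
cons = []
pts = ['0', '1', '*']
for i in pts:
    for j in pts:
        if i != j:
            cons.append(fv[i] >= fv[j] + ip(g[j], x[i] - x[j])
                        + ip(g[i] - g[j], g[i] - g[j]) / (2 * L))
cons.append(ip(x['0'], x['0']) <= R ** 2)  # ||x0 - x*||^2 <= R^2

# The worst case over all L-smooth convex functions is the optimal value of this SDP.
prob = cp.Problem(cp.Maximize(f[1]), cons)
prob.solve()
print("worst-case f(x1) - f*:", prob.value)  # should come out close to L*R^2/6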
Recommendations
- On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions
- Smooth strongly convex interpolation and exact worst-case performance of first-order methods
- Exact worst-case performance of first-order methods for composite convex optimization
- Principled analyses and design of first-order methods with inexact proximal operators
- Convergence analysis of an inexact infeasible interior point method for semidefinite programming
Cites work
- scientific article; zbMATH DE number 729680
- A mathematical view of interior-point methods in convex optimization
- A relationship between the second derivatives of a convex function and of its conjugate
- An extension theorem for convex functions of class \(C^{1,1}\) on Hilbert spaces
- An optimal variant of Kelley's cutting-plane method
- Analysis and design of optimization algorithms via integral quadratic constraints
- Convergence of methods of feasible directions in extremal problems
- Efficient Monte Carlo Procedures for Generating Points Uniformly Distributed over Bounded Regions
- Efficient first-order methods for convex minimization: a constructive approach
- Exact worst-case convergence rates of the proximal gradient method for composite convex minimization
- Exact worst-case performance of first-order methods for composite convex optimization
- First-order methods of smooth convex optimization with inexact oracle
- Inexact proximal Newton methods for self-concordant functions
- Lectures on convex optimization
- On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions
- Optimized first-order methods for smooth convex minimization
- Performance of first-order methods for smooth convex minimization: a novel approach
- Quantitative estimates of the convergence of the empirical covariance matrix in log-concave ensembles
- Simulated Annealing for Convex Optimization
- Smooth Optimization with Approximate Gradient
- Smooth strongly convex interpolation and exact worst-case performance of first-order methods
- The geometry of logconcave functions and sampling algorithms
Cited in (18 documents)
- Complexity Analysis of a Sampling-Based Interior Point Method for Convex Optimization
- On the properties of convex functions over open sets
- Potential Function-Based Framework for Minimizing Gradients in Convex and Min-Max Optimization
- Conditions for linear convergence of the gradient method for non-convex optimization
- Provably faster gradient descent via long steps
- A Systematic Approach to Lyapunov Analyses of Continuous-Time Models in Convex Optimization
- On the rate of convergence of the difference-of-convex algorithm (DCA)
- Interpolation conditions for linear operators and applications to performance estimation problems
- A frequency-domain analysis of inexact gradient methods
- Factor-\(\sqrt{2}\) acceleration of accelerated gradient methods
- On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions
- Analysis of optimization algorithms via sum-of-squares
- A note on the optimal convergence rate of descent methods with fixed step sizes for smooth strongly convex functions
- The exact worst-case convergence rate of the gradient method with fixed step lengths for \(L\)-smooth functions
- Optimal step length for the Newton method: case of self-concordant functions
- Branch-and-bound performance estimation programming: a unified methodology for constructing optimal optimization methods
- Optimal step length for the maximal decrease of a self-concordant function by the Newton method
- Principled analyses and design of first-order methods with inexact proximal operators