Pages that link to "Item:Q5275297"
The following pages link to Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization (Q5275297):
Displaying 33 items.
- On the convergence analysis of the optimized gradient method (Q511969)
- Exact worst-case convergence rates of the proximal gradient method for composite convex minimization (Q1670100)
- On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions (Q1679617)
- Analysis of biased stochastic gradient descent using sequential semidefinite programs (Q2020610)
- Analysis of optimization algorithms via sum-of-squares (Q2046552)
- Optimal step length for the Newton method: case of self-concordant functions (Q2067259)
- Optimal complexity and certification of Bregman first-order methods (Q2149545)
- A frequency-domain analysis of inexact gradient methods (Q2149575)
- Efficient first-order methods for convex minimization: a constructive approach (Q2205976)
- Accelerated proximal point method for maximally monotone operators (Q2235140)
- Fast gradient descent for convex minimization problems with an oracle producing a \((\delta, L)\)-model of function at the requested point (Q2278192)
- The exact worst-case convergence rate of the gradient method with fixed step lengths for \(L\)-smooth functions (Q2673524)
- Tight Sublinear Convergence Rate of the Proximal Point Algorithm for Maximal Monotone Inclusion Problems (Q3300773)
- Generalizing the Optimized Gradient Method for Smooth Convex Minimization (Q4571883)
- Another Look at the Fast Iterative Shrinkage/Thresholding Algorithm (FISTA) (Q4603039)
- Robust and structure exploiting optimisation algorithms: an integral quadratic constraint approach (Q5012635)
- Potential Function-Based Framework for Minimizing Gradients in Convex and Min-Max Optimization (Q5093649)
- Worst-Case Convergence Analysis of Inexact Gradient and Newton Methods Through Semidefinite Programming Performance Estimation (Q5116548)
- Operator Splitting Performance Estimation: Tight Contraction Factors and Optimal Parameter Selection (Q5123997)
- Analysis of the gradient method with an Armijo–Wolfe line search on a class of non-smooth convex functions (Q5210738)
- An optimal gradient method for smooth strongly convex minimization (Q6038652)
- Factor-\(\sqrt{2}\) acceleration of accelerated gradient methods (Q6073850)
- A Systematic Approach to Lyapunov Analyses of Continuous-Time Models in Convex Optimization (Q6116244)
- Branch-and-bound performance estimation programming: a unified methodology for constructing optimal optimization methods (Q6120850)
- An elementary approach to tight worst case complexity analysis of gradient based methods (Q6165581)
- Principled analyses and design of first-order methods with inexact proximal operators (Q6165584)
- Conic linear optimization for computer-assisted proofs. Abstracts from the workshop held April 10--16, 2022 (Q6170529)
- Provably faster gradient descent via long steps (Q6579999)
- Incorporating history and deviations in forward-backward splitting (Q6582402)
- Tight ergodic sublinear convergence rate of the relaxed proximal point algorithm for monotone variational inequalities (Q6596341)
- On the rate of convergence of the difference-of-convex algorithm (DCA) (Q6596346)
- PEPIT: computer-assisted worst-case analyses of first-order optimization methods in python (Q6645946)
- Automated tight Lyapunov analysis for first-order methods (Q6665382)