Pages that link to "Item:Q5116548"
The following pages link to Worst-Case Convergence Analysis of Inexact Gradient and Newton Methods Through Semidefinite Programming Performance Estimation (Q5116548):
Displaying 14 items.
- Analysis of optimization algorithms via sum-of-squares (Q2046552)
- Optimal step length for the Newton method: case of self-concordant functions (Q2067259)
- A frequency-domain analysis of inexact gradient methods (Q2149575)
- A note on the optimal convergence rate of descent methods with fixed step sizes for smooth strongly convex functions (Q2671453)
- The exact worst-case convergence rate of the gradient method with fixed step lengths for \(L\)-smooth functions (Q2673524)
- Complexity Analysis of a Sampling-Based Interior Point Method for Convex Optimization (Q5076724)
- Potential Function-Based Framework for Minimizing Gradients in Convex and Min-Max Optimization (Q5093649)
- On the Properties of Convex Functions over Open Sets (Q5856375)
- Factor-\(\sqrt{2}\) acceleration of accelerated gradient methods (Q6073850)
- Conditions for linear convergence of the gradient method for non-convex optimization (Q6097482)
- A Systematic Approach to Lyapunov Analyses of Continuous-Time Models in Convex Optimization (Q6116244)
- Branch-and-bound performance estimation programming: a unified methodology for constructing optimal optimization methods (Q6120850)
- Optimal step length for the maximal decrease of a self-concordant function by the Newton method (Q6124345)
- Principled analyses and design of first-order methods with inexact proximal operators (Q6165584)