Pages that link to "Item:Q3300773"
The following pages link to Tight Sublinear Convergence Rate of the Proximal Point Algorithm for Maximal Monotone Inclusion Problems (Q3300773):
Displaying 12 items.
- Accelerated proximal point method for maximally monotone operators (Q2235140)
- A unified and tight linear convergence analysis of the relaxed proximal point algorithm (Q2691350)
- Potential Function-Based Framework for Minimizing Gradients in Convex and Min-Max Optimization (Q5093649)
- Factor-\(\sqrt{2}\) acceleration of accelerated gradient methods (Q6073850)
- Fast Krasnosel'skiĭ–Mann Algorithm with a Convergence Rate of the Fixed Point Iteration of \(\boldsymbol{o\left(\frac{1}{k}\right)}\) (Q6091103)
- Branch-and-bound performance estimation programming: a unified methodology for constructing optimal optimization methods (Q6120850)
- Conic linear optimization for computer-assisted proofs. Abstracts from the workshop held April 10--16, 2022 (Q6170529)
- Generalized proximal point algorithms with correction terms and extrapolation (Q6536963)
- Provably faster gradient descent via long steps (Q6579999)
- Tight ergodic sublinear convergence rate of the relaxed proximal point algorithm for monotone variational inequalities (Q6596341)
- Interpolation conditions for linear operators and applications to performance estimation problems (Q6601207)
- PEPit: computer-assisted worst-case analyses of first-order optimization methods in Python (Q6645946)