Tight ergodic sublinear convergence rate of the relaxed proximal point algorithm for monotone variational inequalities
From MaRDI portal
Publication:6596341
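For context, the relaxed proximal point algorithm (PPA) named in the title solves the monotone inclusion \(0 \in T(z)\) for a maximal monotone operator \(T\) by a resolvent step followed by a relaxation step. A standard sketch of the iteration (notation assumed here, not taken from this record; \(c > 0\) is the proximal parameter and \(\gamma_k \in (0, 2)\) the relaxation parameter):

```latex
% Relaxed proximal point iteration for 0 \in T(z):
% resolvent step with parameter c, then relaxation with \gamma_k \in (0, 2).
\begin{align*}
  \tilde{z}^{k} &= (I + c\,T)^{-1}\bigl(z^{k}\bigr), \\
  z^{k+1}       &= (1 - \gamma_k)\, z^{k} + \gamma_k\, \tilde{z}^{k}.
\end{align*}
```

Taking \(\gamma_k \equiv 1\) recovers the classical PPA of the cited Rockafellar paper "Monotone Operators and the Proximal Point Algorithm".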
Recommendations
- Tight sublinear convergence rate of the proximal point algorithm for maximal monotone inclusion problems
- Convergence analysis of the relaxed proximal point algorithm
- A unified and tight linear convergence analysis of the relaxed proximal point algorithm
- Convergence analysis of a relaxed extragradient–proximal point algorithm: application to variational inequalities
- The PPA-based numerical algorithm with the \(O(1/t)\) convergence rate for variant variational inequalities
Cites work
- scientific article; zbMATH DE number 3341597
- Accelerated proximal point method for maximally monotone operators
- An optimal variant of Kelley's cutting-plane method
- Another look at the fast iterative shrinkage/thresholding algorithm (FISTA)
- Convex analysis and monotone operator theory in Hilbert spaces
- Customized proximal point algorithms for linearly constrained convex minimization and saddle-point problems: a unified approach
- Exact worst-case convergence rates of the proximal gradient method for composite convex minimization
- Exact worst-case performance of first-order methods for composite convex optimization
- Finite-Dimensional Variational Inequalities and Complementarity Problems
- Generalizing the optimized gradient method for smooth convex minimization
- Iteration-complexity of block-decomposition algorithms and the alternating direction method of multipliers
- Monotone Operators and the Proximal Point Algorithm
- New Proximal Point Algorithms for Convex Minimization
- On the Convergence of the Proximal Point Algorithm for Convex Minimization
- On the \(O(1/n)\) convergence rate of the Douglas-Rachford alternating direction method
- On the convergence analysis of the optimized gradient method
- On the convergence rate of the Halpern-iteration
- On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions
- Operator splitting performance estimation: tight contraction factors and optimal parameter selection
- Optimized first-order methods for smooth convex minimization
- Optimizing the efficiency of first-order methods for decreasing the gradient of smooth convex functions
- Performance of first-order methods for smooth convex minimization: a novel approach
- Primal-dual subgradient methods for convex problems
- Proximité et dualité dans un espace hilbertien
- Quadratic Matrix Programming
- Smooth strongly convex interpolation and exact worst-case performance of first-order methods
- The proximal point algorithm with genuine superlinear convergence for the monotone complementarity problem
- Tight sublinear convergence rate of the proximal point algorithm for maximal monotone inclusion problems
- Using SeDuMi 1.02, A Matlab toolbox for optimization over symmetric cones
Cited in (1)
This page was built for publication: Tight ergodic sublinear convergence rate of the relaxed proximal point algorithm for monotone variational inequalities