On the complexity of the hybrid proximal extragradient method for the iterates and the ergodic mean
DOI: 10.1137/090753127 · zbMATH Open: 1230.90200 · OpenAlex: W1966953121 · Wikidata: Q105584124 · Scholia: Q105584124 · MaRDI QID: Q3083307 · FDO: Q3083307
Authors: Renato D. C. Monteiro, B. F. Svaiter
Publication date: 21 March 2011
Published in: SIAM Journal on Optimization
Full work available at URL: https://semanticscholar.org/paper/1ce262efacfbe6194deb84d024c10c03082875ba
Recommendations
- A variant of the hybrid proximal extragradient method for solving strongly monotone inclusions and its complexity analysis
- On the convergence analysis of inexact hybrid extragradient proximal point algorithms for maximal monotone operators
- Weak Convergence of a Relaxed and Inertial Hybrid Projection-Proximal Point Algorithm for Maximal Monotone Operators in Hilbert Space
- Iteration-complexity of a Newton proximal extragradient method for monotone variational inequalities and inclusion problems
- A hybrid approximate extragradient-proximal point algorithm using the enlargement of a maximal monotone operator
Keywords: complexity; variational inequality; maximal monotone operator; complementarity problems; extragradient; Korpelevich and Newton methods
MSC classifications:
- Numerical mathematical programming methods (65K05)
- Numerical optimization and variational techniques (65K10)
- Convex programming (90C25)
- Abstract computational complexity for mathematical programming problems (90C60)
- Complementarity and equilibrium problems and variational inequalities (finite dimensions) (aspects of mathematical programming) (90C33)
- Monotone operators and generalizations (47H05)
- Variational and other types of inequalities involving nonlinear operators (general) (47J20)
Cited in (77)
- Complexity of a quadratic penalty accelerated inexact proximal point method for solving linearly constrained nonconvex composite programs
- A variant of the hybrid proximal extragradient method for solving strongly monotone inclusions and its complexity analysis
- Weak versus strong convergence of a regularized Newton dynamic for maximal monotone operators
- Communication-efficient algorithms for decentralized and stochastic optimization
- Iteration-complexity analysis of a generalized alternating direction method of multipliers
- An optimal method for stochastic composite optimization
- A smooth primal-dual optimization framework for nonsmooth composite convex minimization
- Variants of the A-HPE and large-step A-HPE algorithms for strongly convex problems with applications to accelerated high-order tensor methods
- Regularized HPE-Type Methods for Solving Monotone Inclusions with Improved Pointwise Iteration-Complexity Bounds
- A Proximal Bundle Variant with Optimal Iteration-Complexity for a Large Range of Prox Stepsizes
- An accelerated linearized alternating direction method of multipliers
- Complexity of the relaxed Peaceman-Rachford splitting method for the sum of two maximal strongly monotone operators
- A \(\mathcal O(1/k^{3/2})\) hybrid proximal extragradient primal-dual interior point method for nonlinear monotone mixed complementarity problems
- Pointwise and ergodic convergence rates of a variable metric proximal alternating direction method of multipliers
- Local convergence of tensor methods
- Dynamic stochastic approximation for multi-stage stochastic optimization
- Iteration-complexity of a Newton proximal extragradient method for monotone variational inequalities and inclusion problems
- On inexact relative-error hybrid proximal extragradient, forward-backward and Tseng's modified forward-backward methods with inertial effects
- Convergence rate of \(\mathcal{O}(1/k)\) for optimistic gradient and extragradient methods in smooth convex-concave saddle point problems
- Over relaxed hybrid proximal extragradient algorithm and its application to several operator splitting methods
- An efficient algorithm for nonconvex-linear minimax optimization problem and its application in solving weighted maximin dispersion problem
- An extragradient-type algorithm for variational inequality on Hadamard manifolds
- Iteration-complexity of first-order penalty methods for convex programming
- Convergence rates with inexact non-expansive operators
- An \(O(s^r)\)-resolution ODE framework for understanding discrete-time algorithms and applications to the linear convergence of minimax problems
- A proximal-Newton method for unconstrained convex optimization in Hilbert spaces
- On the complexity of a hybrid proximal extragradient projective method for solving monotone inclusion problems
- On the complexity of the projective splitting and Spingarn's methods for the sum of two maximal monotone operators
- Implementation of a block-decomposition algorithm for solving large-scale conic semidefinite programming problems
- A hybrid proximal extragradient self-concordant primal barrier method for monotone variational inequalities
- Accelerating block-decomposition first-order methods for solving composite saddle-point and two-player Nash equilibrium problems
- An accelerated HPE-type algorithm for a class of composite convex-concave saddle-point problems
- Generalized mirror prox algorithm for monotone variational inequalities: Universality and inexact oracle
- A control-theoretic perspective on optimal high-order optimization
- Stochastic relaxed inertial forward-backward-forward splitting for monotone inclusions in Hilbert spaces
- A class of Fejér convergent algorithms, approximate resolvents and the hybrid proximal-extragradient method
- Accelerated schemes for a class of variational inequalities
- Extragradient Method with Variance Reduction for Stochastic Variational Inequalities
- Complexity of first-order inexact Lagrangian and penalty methods for conic convex programming
- A first-order block-decomposition method for solving two-easy-block structured semidefinite programs
- An \(\mathcal O(1/{k})\) convergence rate for the variable stepsize Bregman operator splitting algorithm
- Interior hybrid proximal extragradient methods for the linear monotone complementarity problem
- Maximal monotone inclusions and Fitzpatrick functions
- Non-stationary First-Order Primal-Dual Algorithms with Faster Convergence Rates
- An inexact Spingarn's partial inverse method with applications to operator splitting and composite optimization
- Improved Pointwise Iteration-Complexity of A Regularized ADMM and of a Regularized Non-Euclidean HPE Framework
- Bregman proximal point algorithm revisited: a new inexact version and its inertial variant
- A partially inexact proximal alternating direction method of multipliers and its iteration-complexity analysis
- Iteration complexity of an inexact Douglas-Rachford method and of a Douglas-Rachford-Tseng's F-B four-operator splitting method for solving monotone inclusions
- Tikhonov-like methods with inexact minimization for solving linear ill-posed problems
- A Hybrid Proximal-Extragradient Algorithm with Inertial Effects
- On FISTA with a relative error rule
- Iteration-complexity of a Rockafellar's proximal method of multipliers for convex programming based on second-order approximations
- A majorized ADMM with indefinite proximal terms for linearly constrained convex composite optimization
- On the computation of equilibria in monotone and potential stochastic hierarchical games
- A closed-loop supply chain equilibrium model with random and price-sensitive demand and return
- An extragradient-based alternating direction method for convex minimization
- A projective splitting method for monotone inclusions: iteration-complexity and application to composite optimization
- New primal-dual algorithms for a class of nonsmooth and nonlinear convex-concave minimax problems
- Higher-order methods for convex-concave min-max optimization and monotone variational inequalities
- Simple and optimal methods for stochastic variational inequalities. I: Operator extrapolation
- A partially inexact ADMM with o(1/n) asymptotic convergence rate, 𝒪(1/n) complexity, and immediate relative error tolerance
- Stochastic projective splitting
- On inexact stochastic splitting methods for a class of nonconvex composite optimization problems with relative error
- Complexity-optimal and parameter-free first-order methods for finding stationary points of composite optimization problems
- From Halpern's fixed-point iterations to Nesterov's accelerated interpretations for root-finding problems
- On Korpelevich's extragradient algorithm
- First-order methods for problems with \(O(1)\) functional constraints can have almost the same convergence rate as for unconstrained problems
- On the convergence rate of the scaled proximal decomposition on the graph of a maximal monotone operator (SPDG) algorithm
- A search-free \(O(1/k^{3/2})\) homotopy inexact proximal-Newton extragradient algorithm for monotone variational inequalities
- Last-iterate convergence of saddle-point optimizers via high-resolution differential equations
- Riemannian Hamiltonian Methods for Min-Max Optimization on Manifolds
- Accelerated minimax algorithms flock together
- Perseus: a simple and optimal high-order method for variational inequalities
- Principled analyses and design of first-order methods with inexact proximal operators
- A unified single-loop alternating gradient projection algorithm for nonconvex-concave and convex-nonconcave minimax problems
- On the iteration-complexity of a non-Euclidean hybrid proximal extragradient framework and of a proximal ADMM