Extragradient-Type Methods with $\mathcal{O}(1/k)$ Last-Iterate Convergence Rates for Co-Hypomonotone Inclusions
From MaRDI portal
Publication:6425841
Abstract: In this paper, we develop two "Nesterov's accelerated" variants of the well-known extragradient method to approximate a solution of a co-hypomonotone inclusion given by the sum of two operators, where one is Lipschitz continuous and the other is possibly multivalued. The first scheme can be viewed as an accelerated variant of Tseng's forward-backward-forward splitting method, while the second is a variant of the reflected forward-backward splitting method, which requires only one evaluation of the Lipschitz operator and one resolvent of the multivalued operator per iteration. Under a proper choice of the algorithmic parameters and appropriate conditions on the co-hypomonotone parameter, we theoretically prove that both algorithms achieve $\mathcal{O}(1/k)$ last-iterate convergence rates on the norm of the residual, where $k$ is the iteration counter. Our results can be viewed as alternatives to a recent class of Halpern-type schemes for root-finding problems.
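To illustrate the base scheme that the paper's first algorithm accelerates, below is a minimal sketch of classical (non-accelerated) Tseng forward-backward-forward splitting on a toy monotone inclusion $0 \in F(x) + G(x)$. The operator choices here ($F$ a skew-symmetric linear map, $G$ the subdifferential of the $\ell_1$-norm, whose resolvent is the soft-threshold) are illustrative assumptions, not taken from the paper.

```python
# Sketch of Tseng's forward-backward-forward (FBF) splitting.
# Toy problem: find x with 0 in F(x) + G(x), where
#   F(x) = A x with A = [[0, 1], [-1, 0]]  (skew-symmetric, hence monotone
#                                           and 1-Lipschitz),
#   G    = subdifferential of the l1-norm  (multivalued; its resolvent is
#                                           the soft-threshold operator).
# The unique zero of F + G here is x = 0. This is NOT the paper's
# accelerated variant, only the classical base method.

def F(x):
    # Skew-symmetric linear operator A x.
    return [x[1], -x[0]]

def soft_threshold(x, t):
    # Resolvent J_{tG} of G = subdifferential of the l1-norm.
    return [max(abs(v) - t, 0.0) * (1.0 if v > 0 else -1.0) for v in x]

def tseng_fbf(x, step=0.5, iters=2000):
    # Classical FBF; converges for step < 1/L with L the Lipschitz
    # constant of F (here L = 1).
    for _ in range(iters):
        fx = F(x)
        # Forward-backward step: y = J_{step*G}(x - step*F(x)).
        y = soft_threshold([xi - step * fi for xi, fi in zip(x, fx)], step)
        fy = F(y)
        # Forward correction step: x+ = y - step*(F(y) - F(x)).
        x = [yi - step * (fyi - fxi) for yi, fyi, fxi in zip(y, fy, fx)]
    return x

x_star = tseng_fbf([2.0, 1.0])  # approaches the zero x = 0
```

The residual the abstract's rate refers to can be monitored as the fixed-point gap $\|x_k - J_{\lambda G}(x_k - \lambda F(x_k))\|$, which vanishes exactly at a zero of $F + G$.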
Recommendations
- From Halpern's fixed-point iterations to Nesterov's accelerated interpretations for root-finding problems
- A variant of the hybrid proximal extragradient method for solving strongly monotone inclusions and its complexity analysis
- On inexact relative-error hybrid proximal extragradient, forward-backward and Tseng's modified forward-backward methods with inertial effects
- An inertial hybrid proximal extragradient method for solving monotone inclusions
- An inertial forward-backward algorithm for monotone inclusions
Cited in (1)