A strongly convergent hybrid proximal method in Banach spaces. (Q1425146)
Language | Label | Description | Also known as
---|---|---|---
English | A strongly convergent hybrid proximal method in Banach spaces. | scientific article |
Statements
A strongly convergent hybrid proximal method in Banach spaces. (English)
15 March 2004
When extended to the setting of a (not necessarily Hilbertian) reflexive Banach space \(B\), the Martinet-Rockafellar proximal point method for finding zeros of a (set-valued) maximal monotone operator \(T\) is the following procedure for generating a sequence \(\{x^k\}_{k\in\mathbb{N}}\) starting from an arbitrary point \(x^0\in B\): \[ \text{Given }x^k\text{ define }x^{k+1}\text{ by }0\in Tx^{k+1}+ \lambda_k(f'(x^{k+1})- f'(x^k)), \] where \(\{\lambda_k\}_{k\in\mathbb{N}}\) is a bounded sequence of positive real numbers and \(f\) is a differentiable totally convex function on \(B\) [cf. \textit{D. Butnariu} and \textit{A. N. Iusem}, Numer. Funct. Anal. Optim. 18, No. 7--8, 723--744 (1997; Zbl 0891.49002) and \textit{R. S. Burachik} and \textit{S. Scheimberg}, SIAM J. Control Optim. 39, No. 5, 1633--1649 (2001; Zbl 0988.90045)]. Under undemanding conditions, sequences generated according to this procedure are bounded and converge weakly, along subsequences, to points \(x\) such that \(0\in Tx\). Weak convergence of the entire sequence occurs only under quite demanding conditions on \(f\) and, implicitly, on the geometric structure of \(B\). Even when weak convergence occurs, strong convergence may fail. In [Math. Program. 87A, No. 1, 189--202 (2000; Zbl 0971.90062)] \textit{M. V. Solodov} and \textit{B. F. Svaiter} show an elegant way of modifying that procedure so as to obtain strong convergence of the generated sequences. In the paper under review, the authors follow the same basic idea of modifying the procedure, now in a non-Hilbertian Banach space context, in order to produce a strongly convergent proximal point-type method which is remarkably stable under computational errors.
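For orientation (a standard specialization, not part of the paper under review): when \(B\) is a Hilbert space and \(f(x)=\frac{1}{2}\|x\|^2\), so that \(f'\) is the identity, the displayed inclusion reduces to \[ 0\in Tx^{k+1}+\lambda_k\bigl(x^{k+1}-x^k\bigr) \quad\Longleftrightarrow\quad x^{k+1}=\bigl(I+\lambda_k^{-1}T\bigr)^{-1}(x^k), \] i.e., to the classical Martinet-Rockafellar resolvent step with step size \(1/\lambda_k\), the resolvent being single-valued and everywhere defined by Minty's theorem.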
proximal point method
totally convex function
strong convergence
relative error
inexact solutions
hybrid steps
enlargement of maximal monotone operators