Projected gradient iteration for nonlinear operator equation (Q633958)
From MaRDI portal
scientific article
Language | Label | Description | Also known as
---|---|---|---
English | Projected gradient iteration for nonlinear operator equation | scientific article |
Statements
Projected gradient iteration for nonlinear operator equation (English)
2 August 2011
The article deals with the iterative method \[ x^{(n+1)} = \arg\min_{w \in B_{R,p}} \; F_{\alpha^{(n)}}(w, x^{(n)}), \] where \(F_\alpha(w,x) = \Delta(w) - \|F(w) - F(x)\|_H^2 + \frac1\alpha \, \|w - x\|_2^2\), \(\Delta(x) = \|AS^*(x) - y\|_H^2\), and \(B_{R,p}\) is the ball of radius \(R\) associated with the \(\ell_p\)-norm constraint, for solving a nonlinear equation \(A(f) = y\), where \(A\) is a possibly ill-posed operator from \(X\) to \(H\). The operator \(F\) is defined as \(F = AS^*\), where \(S^*\) is a certain operator from \(\ell_2({\mathbb Z})\) to \(X\). The main theorem describes conditions under which, for a weak accumulation point of \((x^{(n)})\), there exists a subsequence \((x^{(n_j)})\) converging to this accumulation point. At the end of the article the autoconvolution equation \[ \int_0^s x(s - t)\,x(t) \, dt = y(s) \] is considered. It should be remarked that the article is very hard to read and poorly written: the authors start from the equation \(A(f) = y\), then work with the operator \(F = AS^*\), then \(F\) becomes simply a nonlinear operator between Hilbert spaces \(X\) and \(Y\), and after a while this \(Y\) turns into \(H\), and so on.
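For orientation only, the sketch below shows what a projected gradient iteration of this general type might look like for a discretized autoconvolution equation; it is not the authors' algorithm. The discretization, the choice \(p = 1\) (so that the projection onto \(B_{R,1}\) is the standard \(\ell_1\)-ball projection, a shrinkage-type operator), the fixed step size `alpha`, and all function names (`autoconv`, `autoconv_grad`, `project_l1_ball`, `projected_gradient`) are illustrative assumptions; in particular, the paper's surrogate functional \(F_{\alpha^{(n)}}\) and its step-size rule \(\alpha^{(n)}\) are not reproduced here.

```python
import numpy as np

def autoconv(x, h):
    """Discrete autoconvolution: (F x)(s_i) ~ int_0^{s_i} x(s_i - t) x(t) dt."""
    n = x.size
    out = np.empty(n)
    for i in range(n):
        out[i] = h * np.dot(x[:i + 1], x[i::-1])
    return out

def autoconv_grad(x, r, h):
    """Adjoint of the Frechet derivative applied to a residual r:
    F'(x)w (s) = 2 int_0^s x(s - t) w(t) dt, so (F'(x)^* r)_j = 2 h sum_{i >= j} x_{i-j} r_i."""
    n = x.size
    g = np.zeros(n)
    for i in range(n):
        for j in range(i + 1):
            g[j] += 2.0 * h * x[i - j] * r[i]
    return g

def project_l1_ball(v, R):
    """Euclidean projection onto the l1-ball of radius R > 0 (a shrinkage operator)."""
    if np.abs(v).sum() <= R:
        return v
    u = np.sort(np.abs(v))[::-1]
    css = np.cumsum(u)
    k = np.nonzero(u * np.arange(1, v.size + 1) > css - R)[0][-1]
    tau = (css[k] - R) / (k + 1.0)
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def projected_gradient(y, x0, R, h, alpha=0.5, iters=500):
    """Projected gradient iteration x_{n+1} = P_{B_{R,1}}(x_n - alpha F'(x_n)^*(F(x_n) - y)).
    x0 must be nonzero: the gradient of the autoconvolution residual vanishes at x = 0."""
    x = x0.copy()
    for _ in range(iters):
        r = autoconv(x, h) - y                           # residual F(x) - y
        x = project_l1_ball(x - alpha * autoconv_grad(x, r, h), R)
    return x

if __name__ == "__main__":
    # Hypothetical synthetic example: recover x_true from its autoconvolution.
    h = 1.0 / 64
    s = np.arange(1, 65) * h
    x_true = np.exp(-s)
    y = autoconv(x_true, h)
    x_rec = projected_gradient(y, x0=np.ones_like(y), R=np.abs(x_true).sum(), h=h)
```

In the linear case, minimizing a surrogate functional of this form over the ball reduces exactly to such a projected gradient step; for nonlinear \(F\) the plain gradient step used above is only a heuristic simplification of the surrogate-functional minimization.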
\(\ell_p\)-norm constraint
shrinkage operator
projected gradient iteration
ill-posed nonlinear operator equation
Fréchet differentiability