A convergence analysis of a method of steepest descent and a two-step algorithm for nonlinear ill-posed problems
DOI: 10.1080/01630569608816691 · zbMath: 0852.65048 · OpenAlex: W1580825192 · MaRDI QID: Q4886624
Publication date: 4 December 1996
Published in: Numerical Functional Analysis and Optimization
Full work available at URL: https://doi.org/10.1080/01630569608816691
Keywords: stability; convergence; Hilbert spaces; conjugate gradient method; iterative methods; regularization method; nonlinear ill-posed problems; steepest descent method; two-step algorithm
Iterative procedures involving nonlinear operators (47J25) Numerical solutions to equations with nonlinear operators (65J15) Numerical solutions of ill-posed problems in abstract spaces; regularization (65J20)
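The classifications above concern iterative regularization of nonlinear ill-posed operator equations F(x) = y. As context for the steepest descent method named in the keywords, the following is a minimal sketch (not the paper's own algorithm or analysis) of a gradient iteration with discrepancy-principle stopping, using a hypothetical toy forward operator:

```python
import numpy as np

# Sketch of a steepest-descent iteration for F(x) = y_delta with noisy data:
#   x_{k+1} = x_k - omega_k * F'(x_k)^T (F(x_k) - y_delta),
# stopped by the discrepancy principle ||F(x_k) - y_delta|| <= tau * delta.
# F below is a hypothetical toy operator, not from the paper.

def F(x):
    # toy nonlinear forward operator: componentwise cube
    return x ** 3

def F_prime(x):
    # Jacobian of F (diagonal for the componentwise cube)
    return np.diag(3 * x ** 2)

def steepest_descent(y_delta, x0, delta, tau=1.5, max_iter=500):
    x = x0.copy()
    for _ in range(max_iter):
        residual = F(x) - y_delta
        if np.linalg.norm(residual) <= tau * delta:   # discrepancy principle
            break
        g = F_prime(x).T @ residual     # gradient of 0.5 * ||F(x) - y_delta||^2
        Jg = F_prime(x) @ g
        omega = (g @ g) / (Jg @ Jg)     # steepest-descent step for the linearization
        x = x - omega * g
    return x

x_true = np.array([1.0, 0.8, 1.2])
rng = np.random.default_rng(0)
noise = 1e-3 * rng.standard_normal(3)
y_delta = F(x_true) + noise
x_rec = steepest_descent(y_delta, x0=np.ones(3), delta=np.linalg.norm(noise))
```

Stopping at the discrepancy level rather than iterating to full convergence is what makes the iteration a regularization method: for ill-posed problems, later iterates amplify the data noise.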
Related Items (32)
Cites Work
- The instability of some gradient methods for ill-posed problems
- A minimal error conjugate gradient method for ill-posed problems
- A convergence analysis of the Landweber iteration for nonlinear ill-posed problems
- Well posedness and convergence of some regularisation methods for non-linear ill posed problems
- A class of iterative methods of conjugate gradient type
- Convergence rates for Tikhonov regularisation of non-linear ill-posed problems
- Tikhonov regularisation for non-linear ill-posed problems: optimal convergence rates and finite-dimensional approximation
- Some History of the Conjugate Gradient and Lanczos Algorithms: 1948–1976
- Optimal a Posteriori Parameter Choice for Tikhonov Regularization for Solving Nonlinear Ill-Posed Problems
- Steepest descent for singular linear operators with nonclosed range
- On the Convergence of the Conjugate Gradient Method for Singular Linear Operator Equations