Multistep approximation algorithms: Improved convergence rates through postconditioning with smoothing kernels (Q1284137)
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | Multistep approximation algorithms: Improved convergence rates through postconditioning with smoothing kernels | scientific article | |
Statements
Multistep approximation algorithms: Improved convergence rates through postconditioning with smoothing kernels (English)
1 November 1999
Certain widely used multistep approximation algorithms are interpreted as instances of an approximate Newton method. In an earlier paper [\textit{J. W. Jerome}, Numer. Math. 47, 123-138 (1985; Zbl 0579.65046)] the second author showed that the convergence rates of approximate Newton methods suffer from ``loss of derivative'', and that the resulting linear rate of convergence can be improved to a superlinear one by adapting Nash-Moser iteration to numerical analysis. The essence of this adaptation is a splitting of the inversion and the smoothing into two separate steps. In the present paper the authors apply these ideas to scattered data approximation as well as to the numerical solution of partial differential equations. Several radial kernels for the smoothing operation are investigated. Results of Hörmander and the second author are generalized to Sobolev and Besov spaces. The resulting theory yields convergence results for certain multilevel approximation algorithms; the established convergence rate is superlinear when a smoothing operation is included in the algorithm as a postconditioner. Numerical examples conclude the paper and illustrate the power of the theory.
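To make the general scheme concrete, the following Python sketch (not the authors' algorithm) separates the two steps mentioned above in a multilevel radial-kernel approximation of scattered 1D data: each level solves an interpolation system for the current residual (the inversion step) and then mollifies the residual with a smoothing kernel before the next level (the postconditioning step). The Gaussian kernel, the level scales, the smoothing width, and the synthetic target function are illustrative assumptions.

```python
# Minimal sketch of a multilevel scattered-data approximation with a
# smoothing postconditioner; all parameters below are illustrative.
import numpy as np

def gaussian_kernel(r, eps):
    """Gaussian radial kernel phi(r) = exp(-(eps*r)^2)."""
    return np.exp(-(eps * r) ** 2)

def rbf_fit(x, y, eps):
    """Solve the (possibly ill-conditioned) system A c = y, A_ij = phi(|x_i - x_j|)."""
    A = gaussian_kernel(np.abs(x[:, None] - x[None, :]), eps)
    c, *_ = np.linalg.lstsq(A, y, rcond=None)
    return c

def rbf_eval(x_centers, coeffs, eps, x_eval):
    """Evaluate the radial-kernel expansion at the points x_eval."""
    B = gaussian_kernel(np.abs(x_eval[:, None] - x_centers[None, :]), eps)
    return B @ coeffs

def smooth_residual(x, residual, h):
    """Postconditioner: discrete mollification of the residual by a
    row-normalized Gaussian smoothing kernel of width h."""
    W = np.exp(-((x[:, None] - x[None, :]) / h) ** 2)
    W /= W.sum(axis=1, keepdims=True)
    return W @ residual

def multistep_approximation(x, f_values, eps_levels, h_smooth):
    """Accumulate radial-kernel corrections level by level, keeping the
    inversion and the smoothing as two separate steps."""
    residual = f_values.copy()
    corrections = []
    for eps in eps_levels:
        c = rbf_fit(x, residual, eps)                       # inversion step
        residual = residual - rbf_eval(x, c, eps, x)        # update residual
        residual = smooth_residual(x, residual, h_smooth)   # smoothing step
        corrections.append((c, eps))
    return corrections

# Illustrative usage on synthetic data.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 60))
f = np.sin(2 * np.pi * x) + 0.5 * np.cos(5 * np.pi * x)
levels = [2.0, 8.0, 32.0]          # kernel scale refined at each step
model = multistep_approximation(x, f, levels, h_smooth=0.05)
approx = sum(rbf_eval(x, c, eps, x) for c, eps in model)
print("max error on the data sites:", np.max(np.abs(f - approx)))
```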
multistep approximation
multilevel interpolation
Newton iteration
smoothing kernel
postconditioning
algorithms
convergence
Nash-Moser iteration
scattered data approximation
numerical examples