Regularization theory for ill-posed problems. Selected topics (Q357893)

scientific article

    Statements

    Regularization theory for ill-posed problems. Selected topics (English)
    14 August 2013
    This text provides a clear treatment of the fundamental regularization methods for linear ill-posed problems in Hilbert spaces. It covers a number of topics that have been developed rather recently and are not yet treated in other introductory books on ill-posed problems, including stochastic noise models, the balancing principle as a generic a posteriori parameter choice strategy, multiparameter regularization, the adaptive choice of the regularization space, and a meta-learning-based approach to regularization. The text consists of five chapters.
    Chapter 1 introduces basic concepts of the regularization of inverse problems by means of classical examples. It starts with a section on finite difference schemes for numerical differentiation, with the step size serving as the regularization parameter. Special emphasis is put on the balancing principle as an efficient a posteriori choice strategy, which is also introduced in a general framework. The next section is devoted to the regularized summation of orthogonal series based on \( \lambda \)-methods; here, a priori parameter choices are considered in both a deterministic and a stochastic setting. The third section deals with the Cauchy problem for elliptic partial differential equations. After a reformulation as an operator equation for the unknown Dirichlet data on the inaccessible part of the boundary, regularization by discretization based on projection schemes is considered, with the discretization level selected by the balancing principle. An application to the detection of corrosion is then discussed.
    Chapter 2 is concerned with single-parameter regularization methods. After summarizing the necessary mathematical tools, such as the Moore-Penrose generalized inverse and the singular value decomposition, the authors present examples of single-parameter regularization methods, e.g., the spectral cut-off method, Tikhonov-Phillips regularization, iterated Tikhonov regularization, and the Landweber iteration. The best possible accuracy of general regularization schemes under source conditions generated by an index function is then treated comprehensively, for both the deterministic and the stochastic noise model, followed by a section on the qualification of a regularization scheme. The subsequent sections present further material, e.g., on regularization in Hilbert scales, functional strategies, the regularizing properties of projection methods, and the problem of model selection, i.e., an adequate choice of finite-dimensional linear spaces for the approximations.
    Chapter 3 deals with multiparameter regularization, which here means Tikhonov-type regularization with multiple penalty terms. It starts with the analysis of a discrepancy principle for a two-parameter Tikhonov-type method, including a model-function approach used for the numerical implementation. Extensions to multiparameter versions as well as comprehensive numerical tests are then considered. The chapter concludes with a two-parameter Tikhonov-type method for linear ill-posed problems with perturbed operators. The analysis in this chapter is carried out in a deterministic framework.
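    To make the balancing principle discussed in these chapters concrete, here is a minimal Python sketch of a Lepskiĭ-type balancing rule applied to Tikhonov-Phillips regularization of a discretized integration operator. It is not taken from the book: the stability bound \( \delta/(2\sqrt{\alpha}) \), the constant 4, and the geometric parameter grid are common but merely illustrative choices.

```python
import numpy as np

def tikhonov(A, y, alpha):
    """Tikhonov-Phillips regularized solution x_alpha = (A^T A + alpha I)^(-1) A^T y."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ y)

def balancing_principle(A, y_delta, delta, alphas):
    """Lepskii-type balancing principle with illustrative constants.

    alphas is an increasing grid of candidate parameters.  Using the standard
    stability bound ||x_alpha^delta - x_alpha|| <= delta / (2 sqrt(alpha)),
    the rule keeps the largest alpha_i whose solution stays within
    4 * delta / (2 sqrt(alpha_j)) of every solution with smaller alpha_j.
    """
    xs = [tikhonov(A, y_delta, a) for a in alphas]
    best_alpha, best_x = alphas[0], xs[0]
    for i in range(1, len(alphas)):
        if all(np.linalg.norm(xs[i] - xs[j]) <= 4 * delta / (2 * np.sqrt(alphas[j]))
               for j in range(i)):
            best_alpha, best_x = alphas[i], xs[i]
        else:
            break
    return best_alpha, best_x

# Toy ill-posed problem: midpoint-rule discretization of the integration operator,
# so recovering x from y = Ax amounts to numerical differentiation.
n = 100
t = np.linspace(0, 1, n)
A = np.tril(np.ones((n, n))) / n
x_true = np.sin(2 * np.pi * t)
delta = 1e-3
rng = np.random.default_rng(0)
y_delta = A @ x_true + delta * rng.standard_normal(n) / np.sqrt(n)  # noise of norm ~ delta

alphas = np.geomspace(1e-8, 1e-1, 30)  # increasing geometric grid
alpha_star, x_rec = balancing_principle(A, y_delta, delta, alphas)
print("chosen alpha:", alpha_star)
print("relative error:", np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true))
```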
    Chapter 4 discusses the relationship between learning theory and the regularization of ill-posed problems. Learning theory here means predicting the output of a system under study on the basis of a finite set of input-output pairs observed from the same system. The associated fitting function is generated by one-parameter regularization families, where the operators involved are related to appropriate reproducing kernel Hilbert spaces. As parameter choice, the balancing principle is considered here again. The chapter ends with a section on multiparameter regularization in learning theory.
    Finally, Chapter 5 focuses on blood glucose prediction as a case study for meta-learning. This monograph provides a very readable introduction to the regularization of linear ill-posed problems, with an emphasis on topics that are not yet covered by other introductory books on the subject. It requires only basic knowledge of functional analysis and probability theory. Quite a number of applications are presented, including graphical illustrations and numerical results. This excellent text will be of interest to experts in the field as well as to graduate students.
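    The connection drawn in Chapter 4 between supervised learning and regularization can likewise be illustrated by kernel ridge regression, i.e., Tikhonov regularization in a reproducing kernel Hilbert space. The following sketch is a generic illustration under a Gaussian kernel assumption; the kernel width, regularization parameter, and toy data are placeholders and do not correspond to examples from the book.

```python
import numpy as np

def gaussian_kernel(X1, X2, sigma=0.2):
    """Gram matrix of the Gaussian (RBF) reproducing kernel."""
    return np.exp(-(X1[:, None] - X2[None, :]) ** 2 / (2 * sigma ** 2))

def kernel_ridge_fit(X, y, lam, sigma=0.2):
    """Tikhonov regularization in the RKHS: minimize
    (1/n) sum_i (f(x_i) - y_i)^2 + lam * ||f||_K^2.
    By the representer theorem, f(x) = sum_i c_i K(x, x_i) with
    c = (K + lam * n * I)^(-1) y.
    """
    n = len(X)
    K = gaussian_kernel(X, X, sigma)
    c = np.linalg.solve(K + lam * n * np.eye(n), y)
    return lambda X_new: gaussian_kernel(X_new, X, sigma) @ c

# Noisy samples of a smooth target function (toy data, one input dimension).
rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, 40)
y = np.sin(2 * np.pi * X) + 0.1 * rng.standard_normal(40)

f = kernel_ridge_fit(X, y, lam=1e-3)
X_test = np.linspace(0.0, 1.0, 200)
print("max abs error on [0, 1]:",
      np.max(np.abs(f(X_test) - np.sin(2 * np.pi * X_test))))
```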
    inverse problem
    ill-posed problem
    regularization method
    elliptic Cauchy problem
    balancing principle
    discrepancy principle
    summation method
    \( \lambda \)-method
    Fejér method
    stochastic noise model
    deterministic noise model
    projection method
    source condition
    singular value decomposition
    Picard criterion
    Moore-Penrose generalized inverse
    single parameter regularization
    Tikhonov-Phillips regularization
    iterated Tikhonov regularization
    Landweber iteration
    spectral cut-off method
    qualification of a regularization scheme
    Hilbert scale
    deconvolution
    Abel integral equation of the first kind
    multiparameter regularization
    meta-learning
    learning theory
    reproducing kernel Hilbert space
    blood glucose prediction

    Identifiers
