Regularization theory for ill-posed problems. Selected topics (Q357893)

From MaRDI portal
Property / author
 
Property / author: Sergei V. Pereverzyev / rank
Normal rank
 
Property / review text
 
This text provides a clear treatment of the fundamental regularization methods for linear ill-posed problems in Hilbert spaces. It covers a number of topics that have been developed fairly recently and are not yet treated in other introductory books on ill-posed problems: stochastic noise models, the balancing principle as a generic a posteriori parameter choice strategy, multiparameter regularization, the adaptive choice of the regularization space, and a meta-learning based approach to regularization. The text consists of five chapters. Chapter 1 introduces basic concepts for the regularization of inverse problems by means of classical examples. It starts with a section on finite difference schemes for numerical differentiation, with the stepsize serving as the regularization parameter. Special emphasis is placed on the balancing principle as an efficient a posteriori choice strategy, which is also introduced in a general framework. The next section is devoted to the regularized summation of orthogonal series based on \( \lambda \)-methods; here, a priori parameter choices are considered in both a deterministic and a stochastic setting. The third section deals with the Cauchy problem for elliptic partial differential equations. After a reformulation as an operator equation for the unknown Dirichlet data on the inaccessible part of the boundary, regularization by discretization based on projection schemes is considered, with the discretization level selected by the balancing principle. An application to the detection of corrosion follows. Chapter 2 is concerned with single-parameter regularization methods.
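The interplay between stepsize and noise level in numerical differentiation described above can be illustrated with a minimal sketch (not taken from the book; the test function, noise level, and candidate stepsizes are illustrative assumptions): a forward difference applied to data with noise level \( \delta \) has total error of order \( h + \delta/h \), so an intermediate stepsize plays the role of a regularization parameter.

```python
import numpy as np

# Differentiate f(x) = sin(x) at x = 0.5 from perturbed data.
# Illustrative setup: worst-case perturbation of magnitude delta.
x = 0.5
delta = 1e-4
true_derivative = np.cos(x)

def noisy_f(t, sign):
    """Sample of f contaminated by a perturbation of size delta."""
    return np.sin(t) + sign * delta

errors = {}
for h in (1e-1, 3e-2, 1e-2, 1e-3):
    # Forward difference of perturbed data; its error behaves like
    # (h/2)|f''| (approximation) + 2*delta/h (noise propagation).
    approx = (noisy_f(x + h, -1.0) - noisy_f(x, +1.0)) / h
    errors[h] = abs(approx - true_derivative)

# The total error is smallest at an intermediate stepsize h ~ sqrt(delta),
# neither the largest nor the smallest candidate.
best_h = min(errors, key=errors.get)
```

With these numbers the error decreases from \( h = 10^{-1} \) down to \( h \approx 3\cdot 10^{-2} \) and then grows again as the noise term \( \delta/h \) takes over; a posteriori rules such as the balancing principle are designed to locate this crossover without knowledge of the smoothness constants.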
After summarizing the necessary mathematical tools, such as the Moore-Penrose generalized inverse and the singular value decomposition, the authors discuss examples of single-parameter regularization methods, e.g., the spectral cut-off method, Tikhonov-Phillips regularization, iterated Tikhonov regularization, and the Landweber iteration. The best possible accuracy of general regularization schemes under source conditions generated by an index function is then treated comprehensively, for both the deterministic and the stochastic noise model, followed by a section on the qualification of a regularization scheme. The subsequent sections of Chapter 2 present further material, e.g., on regularization in Hilbert scales, functional strategies, the regularizing properties of projection methods, and the problem of model selection, i.e., an adequate choice of the finite-dimensional linear spaces used for the approximations. Chapter 3 deals with multiparameter regularization, which here means Tikhonov-type regularization with multiple penalty terms. It starts with the analysis of a discrepancy principle for a two-parameter Tikhonov-type method, including a model function approach used for the numerical implementation. Extensions to versions with more parameters as well as comprehensive numerical tests are then considered. The chapter concludes with a two-parameter Tikhonov-type method for linear ill-posed problems with perturbed operators. The analysis in this chapter is carried out in a deterministic framework. Chapter 4 discusses the relationship between learning theory and the regularization of ill-posed problems. Learning theory here means predicting the output of a system under study on the basis of a finite set of input-output pairs observed from that system. The associated fitting function is generated by one-parameter regularization families, with the involved operators related to appropriate reproducing kernel Hilbert spaces. As parameter choice, the balancing principle is considered here again.
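The single-parameter methods surveyed in Chapter 2 share a common filter-factor viewpoint based on the singular value decomposition. A minimal sketch (not from the book; the test matrix, noise level, and parameter value are illustrative assumptions) shows Tikhonov-Phillips regularization damping the contributions of the small singular values of a severely ill-conditioned system:

```python
import numpy as np

rng = np.random.default_rng(1)

# Discrete ill-posed model problem: the Hilbert matrix is severely
# ill-conditioned, so the naive solution amplifies the data noise.
n = 12
A = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])
x_true = np.ones(n)
b = A @ x_true + 1e-6 * rng.standard_normal(n)  # noisy right-hand side

U, s, Vt = np.linalg.svd(A)
beta = U.T @ b  # coefficients of b in the left singular basis

def tikhonov(alpha):
    """Tikhonov-Phillips solution via filter factors s_i / (s_i^2 + alpha)."""
    return Vt.T @ (s / (s**2 + alpha) * beta)

naive = Vt.T @ (beta / s)  # unregularized: division by tiny singular values
reg = tikhonov(1e-8)       # alpha would in practice be chosen a posteriori,
                           # e.g. by the discrepancy or balancing principle

err_naive = np.linalg.norm(naive - x_true)
err_reg = np.linalg.norm(reg - x_true)
```

The filter \( s_i/(s_i^2+\alpha) \) leaves large singular values essentially untouched and suppresses those below \( \sqrt{\alpha} \), which is the same mechanism that spectral cut-off implements with a hard threshold.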
The chapter ends with a section on multiparameter regularization in learning theory. Finally, the focus of Chapter 5 is blood glucose prediction as a case study in meta-learning. This monograph provides a very readable introduction to the regularization of linear ill-posed problems, with an emphasis on topics not yet covered by other introductory books in the field. It requires only a basic knowledge of functional analysis and probability theory. Quite a number of applications are presented, including graphical illustrations and numerical results. This excellent text will be of interest to experts in the field as well as to graduate students.
 
Normal rank
Property / reviewed by
 
Property / reviewed by: Robert Plato / rank
 
Normal rank
Property / Mathematics Subject Classification ID
 
Property / Mathematics Subject Classification ID: 47-02 / rank
 
Normal rank
Property / Mathematics Subject Classification ID
 
Property / Mathematics Subject Classification ID: 47A52 / rank
 
Normal rank
Property / Mathematics Subject Classification ID
 
Property / Mathematics Subject Classification ID: 65J20 / rank
 
Normal rank
Property / Mathematics Subject Classification ID
 
Property / Mathematics Subject Classification ID: 65J22 / rank
 
Normal rank
Property / Mathematics Subject Classification ID
 
Property / Mathematics Subject Classification ID: 93E35 / rank
 
Normal rank
Property / zbMATH DE Number
 
Property / zbMATH DE Number: 6198405 / rank
 
Normal rank
Property / zbMATH Keywords
 
inverse problem
Property / zbMATH Keywords: inverse problem / rank
 
Normal rank
Property / zbMATH Keywords
 
ill-posed problem
Property / zbMATH Keywords: ill-posed problem / rank
 
Normal rank
Property / zbMATH Keywords
 
regularization method
Property / zbMATH Keywords: regularization method / rank
 
Normal rank
Property / zbMATH Keywords
 
elliptic Cauchy problem
Property / zbMATH Keywords: elliptic Cauchy problem / rank
 
Normal rank
Property / zbMATH Keywords
 
balancing principle
Property / zbMATH Keywords: balancing principle / rank
 
Normal rank
Property / zbMATH Keywords
 
discrepancy principle
Property / zbMATH Keywords: discrepancy principle / rank
 
Normal rank
Property / zbMATH Keywords
 
summation method
Property / zbMATH Keywords: summation method / rank
 
Normal rank
Property / zbMATH Keywords
 
\( \lambda \)-method
Property / zbMATH Keywords: \( \lambda \)-method / rank
 
Normal rank
Property / zbMATH Keywords
 
Fejér method
Property / zbMATH Keywords: Fejér method / rank
 
Normal rank
Property / zbMATH Keywords
 
stochastic noise model
Property / zbMATH Keywords: stochastic noise model / rank
 
Normal rank
Property / zbMATH Keywords
 
deterministic noise model
Property / zbMATH Keywords: deterministic noise model / rank
 
Normal rank
Property / zbMATH Keywords
 
projection method
Property / zbMATH Keywords: projection method / rank
 
Normal rank
Property / zbMATH Keywords
 
source condition
Property / zbMATH Keywords: source condition / rank
 
Normal rank
Property / zbMATH Keywords
 
singular value decomposition
Property / zbMATH Keywords: singular value decomposition / rank
 
Normal rank
Property / zbMATH Keywords
 
Picard criterion
Property / zbMATH Keywords: Picard criterion / rank
 
Normal rank
Property / zbMATH Keywords
 
Moore-Penrose generalized inverse
Property / zbMATH Keywords: Moore-Penrose generalized inverse / rank
 
Normal rank
Property / zbMATH Keywords
 
single parameter regularization
Property / zbMATH Keywords: single parameter regularization / rank
 
Normal rank
Property / zbMATH Keywords
 
Tikhonov-Phillips regularization
Property / zbMATH Keywords: Tikhonov-Phillips regularization / rank
 
Normal rank
Property / zbMATH Keywords
 
iterated Tikhonov regularization
Property / zbMATH Keywords: iterated Tikhonov regularization / rank
 
Normal rank
Property / zbMATH Keywords
 
Landweber iteration
Property / zbMATH Keywords: Landweber iteration / rank
 
Normal rank
Property / zbMATH Keywords
 
spectral cut-off method
Property / zbMATH Keywords: spectral cut-off method / rank
 
Normal rank
Property / zbMATH Keywords
 
qualification of a regularization scheme
Property / zbMATH Keywords: qualification of a regularization scheme / rank
 
Normal rank
Property / zbMATH Keywords
 
Hilbert scale
Property / zbMATH Keywords: Hilbert scale / rank
 
Normal rank
Property / zbMATH Keywords
 
deconvolution
Property / zbMATH Keywords: deconvolution / rank
 
Normal rank
Property / zbMATH Keywords
 
Abel integral equation of the first kind
Property / zbMATH Keywords: Abel integral equation of the first kind / rank
 
Normal rank
Property / zbMATH Keywords
 
multiparameter regularization
Property / zbMATH Keywords: multiparameter regularization / rank
 
Normal rank
Property / zbMATH Keywords
 
meta-learning
Property / zbMATH Keywords: meta-learning / rank
 
Normal rank
Property / zbMATH Keywords
 
learning theory
Property / zbMATH Keywords: learning theory / rank
 
Normal rank
Property / zbMATH Keywords
 
reproducing kernel Hilbert space
Property / zbMATH Keywords: reproducing kernel Hilbert space / rank
 
Normal rank
Property / zbMATH Keywords
 
blood glucose prediction
Property / zbMATH Keywords: blood glucose prediction / rank
 
Normal rank
Property / describes a project that uses
 
Property / describes a project that uses: Regularization tools / rank
 
Normal rank
Property / MaRDI profile type
 
Property / MaRDI profile type: MaRDI publication profile / rank
 
Normal rank
Property / full work available at URL
 
Property / full work available at URL: https://doi.org/10.1515/9783110286496 / rank
 
Normal rank
Property / OpenAlex ID
 
Property / OpenAlex ID: W4247295697 / rank
 
Normal rank

Latest revision as of 19:40, 19 March 2024

Language: English
Label: Regularization theory for ill-posed problems. Selected topics
Description: scientific article

    Statements

    Regularization theory for ill-posed problems. Selected topics (English)
    14 August 2013
