Minimax theory of image reconstruction (Q690327)

From MaRDI portal
scientific article

    Statements

    Minimax theory of image reconstruction (English)
    21 November 1993
    The main purpose of the book is to develop tools for comparing different image and edge estimators using the asymptotic minimax approach, based on ideas from nonparametric regression and change-point theory.

    Let us introduce the statistical model of a noisy image. Consider \(K = [0,1] \times [0,1]\) and \(f : K \to R\); the sample of observations \(X^{(n)} = ((X_1, Y_1), \dots, (X_n, Y_n))\) satisfies \(Y_i = f(X_i) + \xi_i\), \(i = 1,\dots, n\), where \(X_1, \dots, X_n\), the design points, belong to \(K\) and can be fixed or random; \(\xi_1,\dots, \xi_n\) are i.i.d. random variables; and the vectors \((X_1, \dots, X_n)\) and \((\xi_1, \dots, \xi_n)\) are independent. Binary image: there exists \(G \subseteq K\) such that \(f(x) = 1_G(x) = 1\) if \(x \in G\) and 0 if \(x \in G^c = K\setminus G\). Grey-scale image: there exist \(G \subseteq K\), real-valued functions \(f_0\) and \(f_1\) defined on \(K\), and real numbers \(a\), \(b\), such that \(f(x) = f_0(x)\) if \(x \in G^c\), \(f(x) = f_1(x)\) if \(x \in G\), and \(f_0(x) \leq a < b \leq f_1(x)\) for all \(x \in K\). By image estimation we mean the estimation of a grey-scale image \(f\) from the observations, and by edge estimation the estimation of \(G\) and \(\Gamma = \partial G\) (the boundary of \(G\)) for binary or grey-scale images.

    We say that the positive sequence \((\Psi_n)\) is the minimax rate of convergence for edge estimation if there exist \(0 < p_0 < 1\) and \(C > 0\) such that \[ \liminf_{n \to \infty} \inf_{G^\wedge (n)} \sup_{G \in {\mathcal G}} P_G (\Psi_n^{-1} d(G,G^\wedge (n)) \geq C) \geq p_0, \] where \(\inf_{G^\wedge (n)}\) is the infimum over all edge estimators under consideration, \(P_G\) is the distribution of \(X^{(n)}\) under \(f = 1_G\), \({\mathcal G}\) is a class of compact subsets of \(K\), and \(d\) is a distance between compact sets (for example, the Hausdorff distance or the measure of the symmetric difference).
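    To make the observation model concrete, here is a minimal simulation sketch (an illustrative construction, not taken from the book): it draws random design points on \(K\), observes a binary image of a disk \(G\) corrupted by i.i.d. Gaussian noise, estimates \(G\) by thresholding cell averages on a regular grid, and reports the error in the symmetric-difference distance, approximated on the cell centres. The disk \(G\), the noise level, and the cell-averaging estimator are all assumptions made for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

def in_G(x, y):
    # Example edge set G: a disk of radius 0.3 centred in K = [0,1]^2.
    return (x - 0.5) ** 2 + (y - 0.5) ** 2 <= 0.3 ** 2

# Observations Y_i = 1_G(X_i) + xi_i with a random design on K.
n = 20000
X = rng.uniform(0.0, 1.0, size=(n, 2))
xi = rng.normal(0.0, 0.5, size=n)              # i.i.d. Gaussian noise
Y = in_G(X[:, 0], X[:, 1]).astype(float) + xi

# Naive plug-in edge estimator: split K into m x m cells and declare a
# cell part of the estimate G_hat when its average observation exceeds 1/2.
m = 25
ix = np.minimum((X[:, 0] * m).astype(int), m - 1)
iy = np.minimum((X[:, 1] * m).astype(int), m - 1)
sums = np.zeros((m, m))
counts = np.zeros((m, m))
np.add.at(sums, (ix, iy), Y)
np.add.at(counts, (ix, iy), 1.0)
G_hat = sums / np.maximum(counts, 1.0) > 0.5

# d(G, G_hat): area of the symmetric difference, approximated by the
# fraction of cell centres where the estimate and the truth disagree.
cx = (np.arange(m) + 0.5) / m
true_G = in_G(cx[:, None], cx[None, :])
sym_diff = np.mean(G_hat != true_G)
print(f"symmetric-difference error: {sym_diff:.3f}")
```

    With this many observations per cell, only cells straddling the boundary \(\Gamma = \partial G\) are at risk of misclassification, which is exactly why the rates in the book are driven by the behaviour of estimators near the edge.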
    A sequence of edge estimators \((G^*(n))\) is called an optimal edge estimator if \[ \lim_{C \to \infty} \lim_{n \to \infty} \sup_{G\in {\mathcal G}} P_G (\Psi^{- 1}_n d(G,G^*(n)) \geq C) = 0. \] For image estimation problems the authors use only the squared loss function and the \(L^2\)-norm. \((\Psi_n)\) is the minimax rate of convergence for the class of images \(\Phi\) if for all \(n\) large enough \[ C_0 \leq \inf_{T^\wedge (n)} \sup_{f \in \Phi} \Psi^{-2}_n E_f\Bigl(\int_K [f(x) - T^\wedge(n) (x)]^2 dx\Bigr) \leq C_1 \] for some positive constants \(C_0\), \(C_1\). Here \(\Phi\) is a class of images, \(\inf_{T^\wedge (n)}\) is the infimum over all image estimators under consideration, and \(E_f\) is the expectation under \(f\). A sequence of image estimators \((T^*(n))\) is called an optimal image estimator if for all \(n\) \[ \sup_{f\in \Phi} \Psi^{-2}_n E_f\Bigl(\int_K [f(x) - T^* (n) (x)]^2 dx\Bigr) \leq C_1. \]

    Chapters 1 and 2 give a general and concise introduction to nonparametric estimation theory. The two basic problems of nonparametric regression and nonparametric change-point estimation are discussed in Chapter 1; a simple introduction to the subject is given in a self-sufficient form. For nonparametric regression, a class of locally-polynomial estimators (LPE), which contains the kernel estimator, and the class of piecewise-polynomial estimators (PPE) are studied. For the change-point problem the maximum likelihood estimator is considered. The results proved in the chapter concern mainly the convergence rates of the estimators. Chapter 2 is devoted to minimax lower bounds for arbitrary estimators in general statistical models. Chapters 3 to 9 contain mostly new results, and the reader who is familiar with the nonparametric estimation background may proceed to them directly. Chapter 3 introduces the image models above and a minimax nonparametric framework for image and edge estimation based on ideas similar to those of the preceding chapters.
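    As an illustration of the locally-polynomial estimators mentioned above (a generic textbook construction in one dimension, not the book's exact two-dimensional definition): at each point \(x_0\) the estimator fits a low-degree polynomial to the observations by kernel-weighted least squares and returns its value at \(x_0\); degree 0 recovers the classical kernel estimator. The kernel, bandwidth, and test function below are assumptions made for the demo.

```python
import numpy as np

def lpe(x0, X, Y, degree=1, h=0.1):
    """Locally-polynomial estimate of f(x0): weighted least-squares fit of
    a polynomial of the given degree, with kernel weights concentrated
    on the window [x0 - h, x0 + h]."""
    u = (X - x0) / h
    w = np.maximum(1.0 - u ** 2, 0.0)        # Epanechnikov-type kernel weights
    # Design matrix with columns 1, (x - x0), (x - x0)^2, ...
    A = np.vander(X - x0, degree + 1, increasing=True)
    W = np.sqrt(w)
    coef, *_ = np.linalg.lstsq(A * W[:, None], Y * W, rcond=None)
    return coef[0]                           # value of the local fit at x0

# One-dimensional demo: recover f(x) = x^2 from noisy samples.
rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, 2000)
Y = X ** 2 + rng.normal(0.0, 0.1, X.size)
grid = np.linspace(0.05, 0.95, 19)
fit = np.array([lpe(x0, X, Y, degree=1, h=0.1) for x0 in grid])
mse = np.mean((fit - grid ** 2) ** 2)
print(f"mean squared error on the grid: {mse:.5f}")
```

    The bandwidth \(h\) plays the same bias-variance role here as in the rate calculations of Chapter 1: smaller \(h\) reduces the bias of the local fit but increases its variance.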
    In Chapter 4 optimal image and edge estimators are defined. For edge estimation a modified version of the PPE is used, while for image estimation the two-dimensional LPE is proposed; only the case of boundary fragments in Gaussian noise is considered. The results of Chapters 3 and 4 can be generalized to images defined on the \(N\)-dimensional cube \([0,1]^N\), \(N > 2\), and the assumptions on the noise distribution can be relaxed; these points are the subject of Chapter 5. Along with the \(N\)-dimensional extension of the model above, a multiplicative noise model for binary boundary fragments is also considered in Chapter 5; this model could be of interest for problems in Synthetic Aperture Radar image processing. Chapter 6 deals with simplified image reconstruction procedures, namely with linewise and linear processing. The main questions considered are: do these procedures attain the minimax rate of convergence and, if not, what is the best possible rate within these classes? Chapters 7 to 9 discuss some further issues related to the basic image reconstruction problem, namely: the estimation of the support of a density, the estimation of image functionals such as the domain's area, image estimation from indirect observations (i.e. observations of transformed values of the image \(f\)), and the stochastic tomography setup. For all these problems minimax rates of convergence are derived and optimal estimators are constructed. One remarkable point raised in this book is that the choice of design is important in image analysis: some randomized designs substantially improve the accuracy of estimation compared to the regular grid design. The authors say that their attitude in this book was to prove the results under the simplest assumptions that still preserve the main features of each particular problem.
Thus, they often assume that the random errors are independent, identically distributed Gaussian variables, and generalizations are given without proofs. There are no practical application examples, but the potential usefulness of the models studied is indicated. The list of references includes ninety papers and books on the subject.
    binary image
    nonparametric regression
    change-point theory
    noisy image
    image estimation
    grey-scale images
    edge estimation
    minimax rate of convergence
    optimal edge estimator
    squared loss function
    locally-polynomial estimators
    kernel estimator
    piecewise-polynomial estimators
    maximum likelihood estimator
    convergence rates
    minimax lower bounds
    Gaussian noise
    multiplicative noise model
    binary boundary fragments
    image reconstruction
    estimation of support
    estimation of image functionals
    indirect observations
    stochastic tomography
    randomized designs
