The fast Monte-Carlo cross-validation and \(C_L\) procedures: Comments, new results and application to image recovery problems. (With discussions) (Q1965983)

From MaRDI portal
scientific article

2 March 2000
This article, together with the following discussions, develops and illustrates the author's excellent work on fast randomized methods for selecting the smoothing parameter when solving ill-posed inverse problems or smoothing noisy data. Suppose the indirect observations \(y=(y_1,\dots,y_n)^T\) are described by the simple model \(y=K f^0 +\varepsilon,\) where \(f^0\) is a ``reasonably smooth'' deterministic object to be estimated, \(K\) is a known \(n\times p\) matrix, and the components of the unknown observation error \(\varepsilon\) are realizations of i.i.d. random variables with zero mean. A well-known principle for estimating \(f^0\) is the method of regularization, which consists of minimizing \(n^{-1}\|K f - y\|^2 + \lambda f^T\Omega f,\) where \(\|\cdot\|\) denotes the Euclidean norm, \(\lambda >0\) is the regularization (smoothing) parameter, and the matrix \(\Omega\) measures the ``roughness'' of \(f.\) The purpose of the present paper is to develop methods for choosing the regularization parameter \(\lambda.\) A brief review of some classical methods (the residual method, Mallows' \(C_L\), and generalized cross-validation, \(GCV\)) is given. The author proposes randomized versions of the \(C_L\) and \(GCV\) methods, which are preferable for large problems such as image restoration. The convergence properties of the randomized methods are comparable with those of the exact algorithms. Moreover, the non-asymptotic properties of \(C_L\) and \(GCV\) extend to the randomized versions. Numerical examples of surface estimation from data on a two-dimensional grid demonstrate the advantage of the proposed methods. In his comment [ibid., 233-234], \textit{P. Burman} mentions two alternative approaches to the problem under discussion: \(v\)-fold cross-validation and repeated learning-testing. \textit{F. Godtliebsen} and \textit{H. Rue} [ibid., 235-238] underline the restricted character of linear estimators and propose various kernel estimators for image restoration problems. \textit{H. M. Hudson} and \textit{T. C. M. Lee} [ibid., 239-241] concentrate on the specific case of Poisson distributed data. \textit{J. Kay} [ibid., 243-248; see also the Errata, ibid. 11, No. 1, 87-90 (1996)] proposes to assess the performance of smoothing parameter selectors in terms of estimation error instead of prediction error and introduces two new randomized \(\lambda\) selectors. \textit{G. Wahba} [ibid., 249-250] underlines that a number of simulation studies comparing the exact \(C_L\) and \(GCV\) methods with the randomized versions gave excellent results. In his rejoinder [ibid., 251-258], the author mentions further extensions of the methods.
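The computational bottleneck in exact \(C_L\) and \(GCV\) is the trace of the influence matrix \(A(\lambda)=K(K^TK+n\lambda\Omega)^{-1}K^T\); the randomized versions replace it by a stochastic estimate \(w^TA(\lambda)w\) from a single random probe \(w\) with \(E[ww^T]=I\). The sketch below is an illustrative reconstruction of that idea, not the author's code: the function names, the Rademacher probe, and the dense-solver setup are assumptions for a small example (a real image-restoration problem would use only matrix-vector products with \(K\)).

```python
import numpy as np

def influence_apply(K, Omega, lam, v):
    # Apply the influence ("hat") matrix A(lam) = K (K^T K + n*lam*Omega)^{-1} K^T to v.
    n = K.shape[0]
    M = K.T @ K + n * lam * Omega
    return K @ np.linalg.solve(M, K.T @ v)

def gcv_exact(K, Omega, lam, y):
    # Exact GCV score: (n^{-1} ||(I - A)y||^2) / (n^{-1} tr(I - A))^2.
    n = len(y)
    A = K @ np.linalg.solve(K.T @ K + n * lam * Omega, K.T)
    resid = y - A @ y
    return (resid @ resid / n) / ((1.0 - np.trace(A) / n) ** 2)

def gcv_randomized(K, Omega, lam, y, rng):
    # Randomized GCV: estimate tr(A) by w^T A w with a Rademacher probe w,
    # which is unbiased since E[w w^T] = I; no (n x n) matrix is ever formed.
    n = len(y)
    w = rng.choice([-1.0, 1.0], size=n)
    tr_est = w @ influence_apply(K, Omega, lam, w)
    resid = y - influence_apply(K, Omega, lam, y)
    return (resid @ resid / n) / ((1.0 - tr_est / n) ** 2)
```

One probe costs only two regularized solves per candidate \(\lambda\), versus the full trace computation for the exact score; averaging a few probes reduces the variance of the trace estimate.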
linear inverse problems
image reconstruction
regularization
generalized cross-validation
\(C_L\) method
fast randomized algorithms