Rates of Convergence for Regression with the Graph Poly-Laplacian

From MaRDI portal
Publication:6409809

DOI: 10.1007/S43670-023-00075-5
arXiv: 2209.02305
OpenAlex: W4389057743
MaRDI QID: Q6409809
FDO: Q6409809

Ryan W. Murray, Matthew Thorpe, Nicolás García Trillos

Publication date: 6 September 2022

Abstract: In the (special) smoothing spline problem one considers a variational problem with a quadratic data fidelity penalty and Laplacian regularisation. Higher-order regularity can be obtained by replacing the Laplacian regulariser with a poly-Laplacian regulariser. The methodology is readily adapted to graphs, and here we consider graph poly-Laplacian regularisation in a fully supervised, non-parametric, noise-corrupted regression problem. In particular, given a dataset {x_i}_{i=1}^n and a set of noisy labels {y_i}_{i=1}^n ⊂ ℝ, we let u_n : {x_i}_{i=1}^n → ℝ be the minimiser of an energy consisting of a data fidelity term and an appropriately scaled graph poly-Laplacian term. When y_i = g(x_i) + ξ_i, for iid noise ξ_i, and using the geometric random graph, we identify (with high probability) the rate of convergence of u_n to g in the large-data limit n → ∞. Furthermore, our rate, up to logarithms, coincides with the known rate of convergence in the usual smoothing spline model.
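The energy described in the abstract, a quadratic data fidelity term plus a graph poly-Laplacian penalty, is quadratic in u, so its minimiser solves a linear system: minimising ||u - y||² + λ uᵀLˢu gives u = (I + λLˢ)⁻¹y, where L is the graph Laplacian and s the poly-Laplacian order. Below is a minimal NumPy sketch of this construction on a 1-D geometric random graph. The connectivity radius ε, order s, weight λ, noise level, and test function g are all illustrative assumptions, not the paper's scalings.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0.0, 1.0, size=(n, 1))          # data points in [0, 1]
g = lambda t: np.sin(2 * np.pi * t)             # hypothetical ground-truth regression function
y = g(x[:, 0]) + 0.1 * rng.standard_normal(n)   # noisy labels y_i = g(x_i) + xi_i

# Geometric random graph: connect points within radius eps (eps is an assumed value).
eps = 0.1
d = np.abs(x - x.T)                             # pairwise distances, shape (n, n)
W = ((d < eps) & (d > 0)).astype(float)         # adjacency matrix, zero diagonal
L = np.diag(W.sum(axis=1)) - W                  # unnormalised graph Laplacian D - W

# Graph poly-Laplacian regression: minimise ||u - y||^2 + lam * u^T L^s u.
s, lam = 2, 1e-4                                # order and weight chosen for illustration
u = np.linalg.solve(np.eye(n) + lam * np.linalg.matrix_power(L, s), y)
```

Solving the dense system costs O(n³); for large n one would typically use a sparse Laplacian and an iterative solver instead.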


Full work available at URL: https://doi.org/10.1007/s43670-023-00075-5

This page was built for publication: Rates of Convergence for Regression with the Graph Poly-Laplacian
