Multivariate nonparametric regression by least squares Jacobi polynomials approximations
From MaRDI portal
Publication: Q6389983
arXiv: 2202.01283
MaRDI QID: Q6389983
FDO: Q6389983
Authors: Asma Ben Saber, Sophie Dabo-Niang, Abderrazek Karoui
Publication date: 2 February 2022
Abstract: In this work, we study a random orthogonal projection based least squares estimator for the stable solution of a multivariate nonparametric regression (MNPR) problem. More precisely, given an integer $d \geq 1$ corresponding to the dimension of the MNPR problem, a positive integer $N$ and a real parameter $\alpha$, we show that a fairly large class of $d$-variate regression functions are well and stably approximated by their random projections over the orthonormal set of tensor products of $d$ univariate Jacobi polynomials with parameters $(\alpha, \alpha)$. The associated univariate Jacobi polynomials have degree at most $N$, and their tensor products are orthonormal over the $d$-dimensional hypercube with respect to the associated multivariate Jacobi weights. In particular, if we consider $n$ random sampling points following the $d$-variate Beta distribution with parameters determined by $\alpha$, then we give a relation involving $n$, $N$ and $\alpha$ that ensures that the resulting random projection matrix is well conditioned. Moreover, we provide the squared integrated as well as the $L^2$-risk errors of this estimator. Precise estimates of these errors are given in the case where the regression function belongs to an isotropic Sobolev space $H^s$ with $s > d/2$. Also, to handle the general and practical case of an unknown distribution of the sampling points, we use Shepard's scattered interpolation scheme in order to generate fairly precise approximations of the observed data at $n$ i.i.d. sampling points following a $d$-variate Beta distribution. Finally, we illustrate the performance of our proposed multivariate nonparametric estimator by some numerical simulations with synthetic as well as real data.
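The estimator described in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: it fixes the special case $\alpha = 0$ (so the Jacobi polynomials reduce to Legendre polynomials and the corresponding Beta distribution is uniform), takes $d = 2$, and uses an illustrative smooth test function. The `shepard` helper is likewise a generic inverse-distance-weighting sketch of Shepard's scheme, with a hypothetical power parameter `p`.

```python
import numpy as np
from itertools import product
from scipy.special import eval_legendre  # Jacobi polynomials with parameters (0, 0)

rng = np.random.default_rng(0)
d, N, n = 2, 5, 2000  # dimension, max degree per variable, number of samples

def tensor_design(X, N):
    """Design matrix of tensor-product Legendre polynomials, orthonormal
    w.r.t. the uniform probability measure on [-1, 1]^d."""
    m, d = X.shape
    # orthonormal 1-D basis: p_k(x) = sqrt(2k + 1) * P_k(x)
    uni = [np.sqrt(2 * k + 1) * eval_legendre(k, X) for k in range(N + 1)]
    cols = []
    for idx in product(range(N + 1), repeat=d):
        col = np.ones(m)
        for j, k in enumerate(idx):
            col *= uni[k][:, j]
        cols.append(col)
    return np.column_stack(cols)

# Synthetic data: X_i ~ Beta(1, 1)^d mapped to [-1, 1]^d (i.e. uniform), noisy Y.
# The regression function here is an illustrative choice, not from the paper.
X = 2.0 * rng.beta(1.0, 1.0, size=(n, d)) - 1.0
f_true = lambda X: np.exp(0.5 * (X[:, 0] + X[:, 1]))
Y = f_true(X) + 0.05 * rng.standard_normal(n)

# Least squares projection: for n large relative to (N + 1)^d, the Gram
# matrix (1/n) V^T V concentrates around the identity, which is the
# well-conditioned random projection matrix the abstract refers to.
V = tensor_design(X, N)
coef, *_ = np.linalg.lstsq(V, Y, rcond=None)
f_hat = lambda Xq: tensor_design(Xq, N) @ coef

def shepard(Xobs, Yobs, Xq, p=2):
    """Shepard inverse-distance-weighted interpolation (generic sketch),
    used to resample data observed at arbitrary points onto new
    Beta-distributed sampling points."""
    D = np.linalg.norm(Xq[:, None, :] - Xobs[None, :, :], axis=-1)
    W = 1.0 / np.maximum(D, 1e-12) ** p
    return (W @ Yobs) / W.sum(axis=1)
```

When the covariates do not follow the required Beta law, one would first call `shepard(X, Y, X_beta)` at freshly drawn Beta-distributed points `X_beta` and then fit the least squares projection to the resampled data, mirroring the two-step procedure sketched in the abstract.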