Large Sample Properties of Partitioning-Based Series Estimators (Q143957)

Language: English
Label: Large Sample Properties of Partitioning-Based Series Estimators
Description: scientific article; zbMATH DE number 7241609
Also known as: Large sample properties of partitioning-based series estimators

Statements

13 April 2018
28 August 2020
math.ST
econ.EM
stat.TH
Large sample properties of partitioning-based series estimators (English)
The authors study nonparametric regression problems for univariate responses \(y_1, \ldots, y_n\) and \(\mathbb{R}^d\)-valued, continuously distributed covariates \(\mathbf{x}_1, \ldots, \mathbf{x}_n\), the latter supported on a compact set \(\mathcal{X}\). The object of interest is the (mean) regression function \(\mu(\cdot)\), defined by \(\mu(\mathbf{x}) = \mathbb{E}[y \mid \mathbf{x}]\). The authors consider partitioning-based series least squares estimators (LSEs) for \(\mu(\cdot)\) and its derivatives: \(\mathcal{X}\) is partitioned into non-overlapping cells, on which basis functions are defined. Examples are spline bases, compactly supported wavelet bases, and piecewise polynomial bases. First, the (asymptotic) bias of the LSE is characterized by means of its leading term; based on this characterization, three bias correction methods are derived. Second, the performance of the LSE is analyzed via the asymptotic behaviour of its integrated mean squared error. Third, pointwise (for fixed \(\mathbf{x}\)) and uniform (over \(\mathcal{X}\)) inference methods are developed via central limit theorems and strong approximations, respectively, based on undersmoothing and robust bias correction. For practical purposes, the authors also propose feasible tuning parameter selection procedures and illustrate their theoretical findings by means of Monte Carlo simulations. A minimal illustrative sketch of such an estimator follows below.
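To make the construction concrete, the following sketch fits a partitioning-based series least squares estimator with a disjoint piecewise polynomial basis on an equal-width partition of \([0,1]\) (the univariate \(d = 1\) case). It is not the authors' implementation: the simulated data, the function names such as `partition_basis`, and all tuning choices (number of cells, polynomial degree) are hypothetical assumptions made only for illustration.

```python
# Minimal sketch (not from the paper): partitioning-based series LSE on [0, 1]
# using a piecewise-polynomial basis supported on non-overlapping cells.
import numpy as np

def partition_basis(x, n_cells, degree):
    """Design matrix for a disjoint piecewise-polynomial basis of the given
    degree on an equal-width partition of [0, 1] with n_cells cells."""
    edges = np.linspace(0.0, 1.0, n_cells + 1)
    # Index of the cell containing each observation (right edge goes to the last cell).
    cell = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, n_cells - 1)
    B = np.zeros((x.size, n_cells * (degree + 1)))
    for j in range(n_cells):
        in_cell = cell == j
        for p in range(degree + 1):
            # Local polynomial terms, centered at the cell's left edge.
            B[in_cell, j * (degree + 1) + p] = (x[in_cell] - edges[j]) ** p
    return B

def fit_partition_lse(x, y, n_cells, degree):
    """Least-squares fit of the series coefficients; returns a predictor for mu."""
    B = partition_basis(x, y.size and x, n_cells, degree) if False else partition_basis(x, n_cells, degree)
    beta, *_ = np.linalg.lstsq(B, y, rcond=None)
    return lambda x_new: partition_basis(np.atleast_1d(x_new), n_cells, degree) @ beta

# Illustrative simulated data; the true regression function is sin(2*pi*x).
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=500)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=x.size)

mu_hat = fit_partition_lse(x, y, n_cells=10, degree=2)
print(mu_hat(np.array([0.25, 0.50, 0.75])))  # estimates of mu at three points
```

Because the basis functions are supported on single cells, the least-squares problem decouples across cells, which is the structural feature the paper exploits when characterizing the leading bias term and the integrated mean squared error.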
nonparametric regression
robust bias correction
sieve methods
strong approximation
tuning parameter selection