Approximation of least squares regression on nested subspaces
From MaRDI portal
DOI: 10.1214/aos/1176350830 · zbMath: 0669.62047 · OpenAlex: W2026151926 · MaRDI QID: Q1118945
Publication date: 1988
Published in: The Annals of Statistics
Full work available at URL: https://doi.org/10.1214/aos/1176350830
Keywords: orthogonal polynomials; model selection; rates of convergence; polynomial regression; least squares estimator; nonparametric regression; bias approximation; asymptotic design measure; consistency in supremum norm; Fourier series regression; scale of Hilbert norms; weighted L2 norms
MSC classifications: Asymptotic properties of parametric estimators (62F12) · Linear regression; mixed models (62J05) · Approximation by polynomials (41A10)
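The keywords above describe least squares regression fitted on a nested sequence of approximating subspaces (e.g. polynomials of increasing degree). A minimal illustrative sketch of that setup, not drawn from the paper itself: the regression function, noise level, and sample size below are hypothetical choices, and the degree sequence plays the role of the nested subspaces.

```python
import numpy as np

# Illustrative sketch: least squares regression on nested polynomial
# subspaces S_1 ⊂ S_3 ⊂ S_5 ⊂ S_7, where S_k is spanned by the
# monomials 1, x, ..., x^k. The target function and noise level are
# hypothetical, chosen only to show the approximation error shrinking
# as the subspace grows.
rng = np.random.default_rng(0)
n = 200
x = rng.uniform(-1.0, 1.0, n)
f = np.sin(np.pi * x)                    # "true" regression function
y = f + 0.1 * rng.standard_normal(n)     # noisy observations

mses = []
for k in (1, 3, 5, 7):
    # Design matrix for the degree-k subspace (Vandermonde basis).
    X = np.vander(x, k + 1, increasing=True)
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    fitted = X @ coef
    mse = np.mean((fitted - f) ** 2)     # error against the true function
    mses.append(mse)
    print(f"degree {k}: mean squared error vs. truth = {mse:.5f}")
```

Because the subspaces are nested, enlarging the basis can only reduce the least squares residual on the data; the print-out shows the bias term of the error vanishing as the degree grows, which is the trade-off the paper's convergence-rate analysis quantifies.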
Related Items
Convergence rates and asymptotic normality for series estimators ⋮ Least squares orthogonal polynomial regression estimation for irregular design ⋮ Qualitative and asymptotic performance of SNP density estimators ⋮ On finite-sample properties of adaptive least squares regression estimates ⋮ The asymptotic average squared error for polynomial regression ⋮ Deep nonparametric regression on approximate manifolds: nonasymptotic error bounds with polynomial prefactors ⋮ UNIFORM CONVERGENCE OF SERIES ESTIMATORS OVER FUNCTION SPACES ⋮ Credibility using a loss function from Spline theory ⋮ On the truncated Hausdorff moment problem under Sobolev regularity conditions ⋮ Credibility Using a Loss Function from Spline Theory ⋮ Exponential series estimator of multivariate densities ⋮ What are we estimating when we fit Stevens' power law? ⋮ On generalization in moment-based domain adaptation ⋮ GMM inference when the number of moment conditions is large ⋮ Convergence rates for trigonometric and polynomial-trigonometric regression estimators ⋮ Information-theoretic determination of minimax rates of convergence ⋮ Bayesian curve estimation by polynomial of random order.