An equivalence theorem for \(L_1\) convergence of the kernel regression estimate


DOI: 10.1016/0378-3758(89)90040-2
zbMath: 0686.62027
OpenAlex: W2065086209
MaRDI QID: Q1262650

Adam Krzyżak, Luc P. Devroye

Publication date: 1989

Published in: Journal of Statistical Planning and Inference

Full work available at URL: https://doi.org/10.1016/0378-3758(89)90040-2



Related Items

Learning and Convergence of the Normalized Radial Basis Functions Networks
Estimation of the optimal design of a nonlinear parametric regression problem via Monte Carlo experiments
On radial basis function nets and kernel regression: Statistical consistency, convergence rates, and receptive field size
Estimation of a density in a simulation model
Reducing false positives of network anomaly detection by local adaptive multivariate smoothing
Fixed-design regression estimation based on real and artificial data
On the kernel rule for function classification
Nonparametric estimation of a function from noiseless observations at random points
On the consistency of a new kernel rule for spatially dependent data
On nonparametric classification for weakly dependent functional processes
Optimal global rates of convergence for interpolation problems with random design
Nonparametric discrimination of areal functional data
Estimation of a jump point in random design regression
On Learning and Convergence of RBF Networks in Regression Estimation and Classification
Nonparametric estimation of non-stationary velocity fields from 3D particle tracking velocimetry data
On the correct regression function (in \(L_{2}\)) and its applications when the dimension of the covariate vector is random
Optimal global rates of convergence for noiseless regression estimation problems with adaptively chosen design
On estimation of surrogate models for multivariate computer experiments
Nonparametric estimation of a maximum of quantiles
Nonparametric quantile estimation using importance sampling
On the \(L_p\) norms of kernel regression estimators for incomplete data with applications to classification
Rate of convergence of the density estimation of regression residual
Classifier performance as a function of distributional complexity
Estimation of extreme quantiles in a simulation model
Kernel classification with missing data and the choice of smoothing parameters
Strong universal consistency of smooth kernel regression estimates
Exponential-Bound Property of Estimators and Variable Selection in Generalized Additive Models
Strong consistency of kernel estimates of regression function under dependence
Regression Estimation from an Individual Stable Sequence
Local averaging estimates of the regression function with twice censored data
On classification with nonignorable missing data
Nonparametric estimation of a latent variable model
Smoothing spline regression estimation based on real and artificial data
Strongly consistent nonparametric forecasting and regression for stationary ergodic sequences
Strong universal pointwise consistency of some regression function estimates
The Hilbert kernel regression estimate
Nonparametric quantile estimation using surrogate models and importance sampling
Aggregation using input-output trade-off
On the strong universal consistency of local averaging regression estimates
Strong consistency of a kernel-based rule for spatially dependent data
Adaptive density estimation based on real and artificial data
Prediction from randomly right censored data
Estimation of a regression function corresponding to latent variables
On cross-validation in kernel and partitioning regression estimation


