An equivalence theorem for \(L_1\) convergence of the kernel regression estimate

From MaRDI portal

Publication: 1262650

DOI: 10.1016/0378-3758(89)90040-2
zbMath: 0686.62027
OpenAlex: W2065086209
MaRDI QID: Q1262650

Adam Krzyżak, Luc P. Devroye

Publication date: 1989

Published in: Journal of Statistical Planning and Inference

Full work available at URL: https://doi.org/10.1016/0378-3758(89)90040-2






Related Items (46)

Learning and Convergence of the Normalized Radial Basis Functions Networks
Estimation of the optimal design of a nonlinear parametric regression problem via Monte Carlo experiments
On radial basis function nets and kernel regression: Statistical consistency, convergence rates, and receptive field size
Estimation of a density in a simulation model
Reducing false positives of network anomaly detection by local adaptive multivariate smoothing
Fixed-design regression estimation based on real and artificial data
On the kernel rule for function classification
Nonparametric estimation of a function from noiseless observations at random points
On the consistency of a new kernel rule for spatially dependent data
On nonparametric classification for weakly dependent functional processes
Optimal global rates of convergence for interpolation problems with random design
Nonparametric discrimination of areal functional data
Estimation of a jump point in random design regression
On Learning and Convergence of RBF Networks in Regression Estimation and Classification
Nonparametric estimation of non-stationary velocity fields from 3D particle tracking velocimetry data
On the correct regression function (in \(L_{2}\)) and its applications when the dimension of the covariate vector is random
Optimal global rates of convergence for noiseless regression estimation problems with adaptively chosen design
On estimation of surrogate models for multivariate computer experiments
Nonparametric estimation of a maximum of quantiles
On regression and classification with possibly missing response variables in the data
Nonparametric quantile estimation using importance sampling
On the \(L_p\) norms of kernel regression estimators for incomplete data with applications to classification
Rate of convergence of the density estimation of regression residual
Classifier performance as a function of distributional complexity
Estimation of extreme quantiles in a simulation model
Kernel classification with missing data and the choice of smoothing parameters
Strong universal consistency of smooth kernel regression estimates
Exponential-Bound Property of Estimators and Variable Selection in Generalized Additive Models
Strong consistency of kernel estimates of regression function under dependence
Regression Estimation from an Individual Stable Sequence
Local averaging estimates of the regression function with twice censored data
On classification with nonignorable missing data
Nonparametric estimation of a latent variable model
Smoothing spline regression estimation based on real and artificial data
A kernel-type regression estimator for NMAR response variables with applications to classification
Strongly consistent nonparametric forecasting and regression for stationary ergodic sequences
Strong universal pointwise consistency of some regression function estimates
The Hilbert kernel regression estimate
Nonparametric quantile estimation using surrogate models and importance sampling
Aggregation using input-output trade-off
On the strong universal consistency of local averaging regression estimates
Strong consistency of a kernel-based rule for spatially dependent data
Adaptive density estimation based on real and artificial data
Prediction from randomly right censored data
Estimation of a regression function corresponding to latent variables
On cross-validation in kernel and partitioning regression estimation




Cites Work




This page was built for publication: An equivalence theorem for \(L_1\) convergence of the kernel regression estimate