An equivalence theorem for \(L_1\) convergence of the kernel regression estimate
Publication: 1262650
DOI: 10.1016/0378-3758(89)90040-2
zbMath: 0686.62027
OpenAlex: W2065086209
MaRDI QID: Q1262650
Publication date: 1989
Published in: Journal of Statistical Planning and Inference
Full work available at URL: https://doi.org/10.1016/0378-3758(89)90040-2
Keywords: strong convergence; consistency; complete convergence; nonparametric regression; \(L_1\)-convergence; regression function; kernel regression estimate; almost surely; equivalence of modes of convergence; in probability
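The title and keywords of this entry concern the kernel (Nadaraya-Watson) regression estimate and the equivalence of modes of its \(L_1\) convergence. As a minimal sketch, assuming the standard setting of i.i.d. pairs \((X_i, Y_i)\) with regression function \(m(x) = E\{Y \mid X = x\}\), kernel \(K\), and bandwidth sequence \(h_n\), the objects presumably involved are

\[
m_n(x) = \frac{\sum_{i=1}^{n} Y_i \, K\!\left(\frac{x - X_i}{h_n}\right)}{\sum_{i=1}^{n} K\!\left(\frac{x - X_i}{h_n}\right)},
\qquad
J_n = \int \bigl| m_n(x) - m(x) \bigr| \, \mu(dx),
\]

where \(\mu\) denotes the distribution of \(X\). The keywords (convergence in probability, almost sure convergence, complete convergence) suggest an equivalence theorem asserting that these modes of convergence of \(J_n\) to zero coincide, in the spirit of the cited equivalence result for \(L_1\) convergence of kernel density estimates.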
Related Items (46)
- Learning and Convergence of the Normalized Radial Basis Functions Networks
- Estimation of the optimal design of a nonlinear parametric regression problem via Monte Carlo experiments
- On radial basis function nets and kernel regression: Statistical consistency, convergence rates, and receptive field size
- Estimation of a density in a simulation model
- Reducing false positives of network anomaly detection by local adaptive multivariate smoothing
- Fixed-design regression estimation based on real and artificial data
- On the kernel rule for function classification
- Nonparametric estimation of a function from noiseless observations at random points
- On the consistency of a new kernel rule for spatially dependent data
- On nonparametric classification for weakly dependent functional processes
- Optimal global rates of convergence for interpolation problems with random design
- Nonparametric discrimination of areal functional data
- Estimation of a jump point in random design regression
- On Learning and Convergence of RBF Networks in Regression Estimation and Classification
- Nonparametric estimation of non-stationary velocity fields from 3D particle tracking velocimetry data
- On the correct regression function (in \(L_{2}\)) and its applications when the dimension of the covariate vector is random
- Optimal global rates of convergence for noiseless regression estimation problems with adaptively chosen design
- On estimation of surrogate models for multivariate computer experiments
- Nonparametric estimation of a maximum of quantiles
- On regression and classification with possibly missing response variables in the data
- Nonparametric quantile estimation using importance sampling
- On the \(L_p\) norms of kernel regression estimators for incomplete data with applications to classification
- Rate of convergence of the density estimation of regression residual
- Classifier performance as a function of distributional complexity
- Estimation of extreme quantiles in a simulation model
- Kernel classification with missing data and the choice of smoothing parameters
- Strong universal consistency of smooth kernel regression estimates
- Exponential-Bound Property of Estimators and Variable Selection in Generalized Additive Models
- Strong consistency of kernel estimates of regression function under dependence
- Regression Estimation from an Individual Stable Sequence
- Local averaging estimates of the regression function with twice censored data
- On classification with nonignorable missing data
- Nonparametric estimation of a latent variable model
- Smoothing spline regression estimation based on real and artificial data
- A kernel-type regression estimator for NMAR response variables with applications to classification
- Strongly consistent nonparametric forecasting and regression for stationary ergodic sequences.
- Strong universal pointwise consistency of some regression function estimates
- The Hilbert kernel regression estimate.
- Nonparametric quantile estimation using surrogate models and importance sampling
- Aggregation using input-output trade-off
- On the strong universal consistency of local averaging regression estimates
- Strong consistency of a kernel-based rule for spatially dependent data
- Adaptive density estimation based on real and artificial data
- Prediction from randomly right censored data
- Estimation of a regression function corresponding to latent variables
- On cross-validation in kernel and partitioning regression estimation.
Cites Work
- Distribution-free pointwise consistency of kernel regression estimate
- The equivalence of weak, strong, and complete convergence in \(L_1\) for kernel density estimates
- The kernel estimate is relatively stable
- Distribution-free consistency results in nonparametric discrimination and regression function estimation
- Consistent window estimation in nonparametric regression
- Optimal rates of convergence for nonparametric estimators
- On the almost everywhere convergence of nonparametric regression function estimates
- Weighted sums of certain dependent random variables
- Distribution-free consistency of a nonparametric kernel regression estimate and classification
- Estimation Non-paramétrique de la Régression: Revue Bibliographique
- The rates of convergence of kernel regression estimates and classification rules
- Probability Inequalities for Sums of Bounded Random Variables
- Some Results on the Complete and Almost Sure Convergence of Linear Combinations of Independent Random Variables and Martingale Differences
- Remarks on Non-Parametric Estimates for Density Functions and Regression Curves