Rates of convergence for the k-nearest neighbor estimators with smoother regression functions
From MaRDI portal
DOI: 10.1016/j.jspi.2012.03.012
zbMATH Open: 1428.62149
arXiv: 1102.5633
OpenAlex: W2074895514
MaRDI QID: Q447616
Publication date: 4 September 2012
Published in: Journal of Statistical Planning and Inference
Abstract: In regression analysis one wants to estimate the regression function from data. In this paper we consider the rate of convergence of the nearest neighbor estimator in the case that the regression function is \(p\)-smooth. It is an open problem whether the optimal rate can be achieved by some nearest neighbor estimator in the case that \(p\) is in \((1, 1.5]\). We solve this problem affirmatively; this is the main result of the paper. Throughout this paper, we assume that the data are independent and identically distributed, and as an error criterion we use the expected \(L_2\) error.
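For readers unfamiliar with the estimator under study, the classical \(k\)-nearest neighbor regression estimate averages the responses of the \(k\) training points closest to the query point. The sketch below is a minimal illustration of that plain local-averaging form, not the specific variant analyzed in the paper; all names and the toy data are illustrative.

```python
import numpy as np

def knn_regression(x_train, y_train, x_query, k):
    """Plain k-nearest neighbor regression estimate:
    average the responses of the k training points closest to x_query."""
    dists = np.linalg.norm(x_train - x_query, axis=1)  # Euclidean distances
    nearest = np.argsort(dists)[:k]                    # indices of k closest points
    return y_train[nearest].mean()

# Toy example: noiseless regression function m(x) = x^2 on [0, 1].
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=(200, 1))
y = x[:, 0] ** 2
est = knn_regression(x, y, np.array([0.5]), k=10)  # should be close to m(0.5) = 0.25
```

The rate-of-convergence question in the paper concerns how the expected \(L_2\) error of such estimates decays with the sample size when \(k\) is chosen appropriately and the regression function has smoothness \(p\).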
Full work available at URL: https://arxiv.org/abs/1102.5633
Recommendations
- Strong convergence rates of nearest neighbor estimators for regression functions under random censoring
- Strong uniform consistency of \(k\)-nearest neighbor regression function estimators
- Strong consistency of nearest neighbor regression function estimators
Mathematics Subject Classification
- Nonparametric estimation (62G05)
- Nonparametric regression and quantile regression (62G08)
- Asymptotic properties of nonparametric inference (62G20)
Cites Work
- Consistent nonparametric regression. Discussion
- Optimal global rates of convergence for nonparametric regression
- Histogram regression estimation using data-dependent partitions
- A distribution-free theory of nonparametric regression
- Consistent window estimation in nonparametric regression
- On the strong universal consistency of nearest neighbor regression function estimates
- Nonparametric regression estimation using penalized least squares
- Nonparametric estimation via empirical risk minimization
- Rates of convergence of nearest neighbor estimation under arbitrary sampling
- Rates of convergence for partitioning and nearest neighbor regression estimates with unbounded data
- Optimal global rates of convergence for nonparametric regression with unbounded data
- Universally consistent regression function estimation using hierarchical \(B\)-splines
- On the strong universal consistency of a recursive regression estimate by Pál Révész
- Universal consistency of local polynomial kernel regression estimates
- Strong universal consistency of smooth kernel regression estimates
- The rate of convergence of \(k_n\)-NN regression estimates and classification rules (Corresp.)
- Distribution-free pointwise consistency of kernel regression estimate
- A universal strong law of large numbers for conditional expectations via nearest neighbors
Cited In
- Rates of convergence for partitioning and nearest neighbor regression estimates with unbounded data
- Analysis of KNN Information Estimators for Smooth Distributions
- The optimal rate of convergence of error for k-nn median regression estimates
- Global and local two-sample tests via regression