Exact bootstrap \(k\)-nearest neighbor learners
DOI: 10.1007/s10994-008-5096-0
zbMath: 1470.68179
OpenAlex: W1990493242
MaRDI QID: Q1009331
Publication date: 31 March 2009
Published in: Machine Learning
Full work available at URL: https://doi.org/10.1007/s10994-008-5096-0
Mathematics Subject Classification:
- Inference from stochastic processes and prediction (62M20)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Learning and adaptive systems in artificial intelligence (68T05)
Related Items
- On the layered nearest neighbour estimate, the bagged nearest neighbour estimate and the random forest method in regression and classification
- Optimal Nonparametric Inference with Two-Scale Distributional Nearest Neighbors
- Theoretical analysis of cross-validation for estimating the risk of the k-Nearest Neighbor classifier
Cites Work
- Bagging predictors
- Bagging equalizes influence
- On bagging and nonlinear estimation
- Convergence of sample paths of normalized sums of induced order statistics
- A decision-theoretic generalization of on-line learning and an application to boosting
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Analyzing bagging
- On linear discriminant analysis with adaptive ridge classification rules
- Locally Weighted Regression: An Approach to Regression Analysis by Local Fitting
- Local Regression and Likelihood
- Improvements on Cross-Validation: The .632+ Bootstrap Method
- The Exact Bootstrap Mean and Variance of an L-estimator
- Properties of Bagged Nearest Neighbour Classifiers
- Nearest neighbor pattern classification
- Ridge Regression: Biased Estimation for Nonorthogonal Problems
- The elements of statistical learning. Data mining, inference, and prediction
- Random forests