Exact bootstrap \(k\)-nearest neighbor learners
DOI: 10.1007/s10994-008-5096-0 · zbMath: 1470.68179 · MaRDI QID: Q1009331
Publication date: 31 March 2009
Published in: Machine Learning
Full work available at URL: https://doi.org/10.1007/s10994-008-5096-0
62M20: Inference from stochastic processes and prediction
62H30: Classification and discrimination; cluster analysis (statistical aspects)
68T05: Learning and adaptive systems in artificial intelligence
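The title refers to computing the bagged (bootstrap-aggregated) nearest-neighbor predictor exactly rather than by Monte Carlo resampling. As a minimal illustrative sketch (not the paper's own code), the 1-NN case uses the standard closed-form weights from the bagged nearest-neighbour literature cited below: the \(i\)-th nearest neighbor of a query point receives weight \(p_i = ((n-i+1)/n)^m - ((n-i)/n)^m\), the probability that it is the closest training point appearing in a bootstrap resample of size \(m\) from \(n\) points. All function names here are hypothetical.

```python
import numpy as np

def exact_bagged_1nn_weights(n, m=None):
    """Exact bootstrap weights for the bagged 1-NN predictor.

    p_i = P(the i-th nearest neighbour is the closest training point
            present in a bootstrap resample of size m drawn from n points)
        = ((n - i + 1)/n)**m - ((n - i)/n)**m,  for i = 1, ..., n.
    The weights telescope to sum exactly to 1.
    """
    if m is None:
        m = n  # conventional bootstrap: resample size equals sample size
    i = np.arange(1, n + 1)
    return ((n - i + 1) / n) ** m - ((n - i) / n) ** m

def exact_bagged_1nn_predict(X_train, y_train, x):
    """Weight the responses of training points, ordered by distance to x,
    with the exact bootstrap weights -- no Monte Carlo resampling."""
    dists = np.linalg.norm(X_train - x, axis=1)
    order = np.argsort(dists)
    w = exact_bagged_1nn_weights(len(X_train))
    return float(w @ y_train[order])
```

With \(m = n\), the nearest neighbor's weight is \(1 - (1 - 1/n)^n \approx 1 - e^{-1} \approx 0.632\), so exact bagging smoothly downweights farther neighbors; the paper generalizes this idea to \(k\)-nearest neighbor learners.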
Related Items
- Theoretical analysis of cross-validation for estimating the risk of the k-Nearest Neighbor classifier
- Optimal Nonparametric Inference with Two-Scale Distributional Nearest Neighbors
- On the layered nearest neighbour estimate, the bagged nearest neighbour estimate and the random forest method in regression and classification
Cites Work
- Bagging predictors
- Bagging equalizes influence
- On bagging and nonlinear estimation
- Convergence of sample paths of normalized sums of induced order statistics
- A decision-theoretic generalization of on-line learning and an application to boosting
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Analyzing bagging
- On linear discriminant analysis with adaptive ridge classification rules
- Locally Weighted Regression: An Approach to Regression Analysis by Local Fitting
- Local Regression and Likelihood
- Improvements on Cross-Validation: The .632+ Bootstrap Method
- The Exact Bootstrap Mean and Variance of an L-estimator
- Properties of Bagged Nearest Neighbour Classifiers
- Nearest neighbor pattern classification
- Ridge Regression: Biased Estimation for Nonorthogonal Problems
- The elements of statistical learning. Data mining, inference, and prediction
- Random forests