Properties of Bagged Nearest Neighbour Classifiers
Publication: 5313456
DOI: 10.1111/j.1467-9868.2005.00506.x
zbMath: 1069.62051
OpenAlex: W2150555551
MaRDI QID: Q5313456
Peter Hall, Richard J. Samworth
Publication date: 1 September 2005
Published in: Journal of the Royal Statistical Society Series B: Statistical Methodology
Full work available at URL: https://doi.org/10.1111/j.1467-9868.2005.00506.x
Keywords: prediction, bootstrap, density, cross-validation, marked point process, Poisson process, discrimination, error rate, Bayes risk, regret, statistical learning, classification error, with-replacement sampling, without-replacement sampling
Classification and discrimination; cluster analysis (statistical aspects) (62H30)
Bayesian inference (62F15)
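The keywords above refer to bagging a nearest neighbour classifier by combining its predictions over many with-replacement or without-replacement resamples of the training data. The following Python sketch illustrates that basic idea only; it is not code from the paper, and the function name and parameters are hypothetical choices for illustration.

```python
# Minimal sketch of a bagged 1-nearest-neighbour classifier (illustration only,
# not code from Hall & Samworth 2005). Each of B resamples, drawn with or
# without replacement, casts a 1-NN vote; the final label is the majority vote.
import numpy as np

def bagged_1nn_predict(X_train, y_train, X_test, n_resamples=100,
                       resample_size=None, with_replacement=True, seed=0):
    rng = np.random.default_rng(seed)
    n = len(X_train)
    m = resample_size or n          # size of each resample (m out of n)
    votes = np.zeros((len(X_test), n_resamples))
    for b in range(n_resamples):
        idx = rng.choice(n, size=m, replace=with_replacement)
        Xb, yb = X_train[idx], y_train[idx]
        # squared Euclidean distances from each test point to the resample
        d = ((X_test[:, None, :] - Xb[None, :, :]) ** 2).sum(axis=2)
        votes[:, b] = yb[d.argmin(axis=1)]   # 1-NN label within this resample
    # majority vote across resamples (binary labels 0/1 assumed for simplicity)
    return (votes.mean(axis=1) >= 0.5).astype(int)

# toy usage on synthetic two-class data
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
X_new = rng.normal(size=(5, 2))
print(bagged_1nn_predict(X, y, X_new, n_resamples=50,
                         resample_size=100, with_replacement=False))
```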
Related Items
- Partial Least Squares for Heterogeneous Data
- Measuring the Algorithmic Convergence of Randomized Ensembles: The Regression Setting
- Ensemble of a subset of \(k\)NN classifiers
- Optimal Nonparametric Inference with Two-Scale Distributional Nearest Neighbors
- Estimating the algorithmic variance of randomized ensembles via the bootstrap
- Cross-validated bagged learning
- Out-of-bag estimation of the optimal sample size in bagging
- Exact bootstrap \(k\)-nearest neighbor learners
- Optimal weighted nearest neighbour classifiers
- Estimating a parameter when it is known that the parameter exceeds a given value
- Estimating a sharp convergence bound for randomized ensembles
- Probability estimation with machine learning methods for dichotomous and multicategory outcome: Theory
- Probability estimation with machine learning methods for dichotomous and multicategory outcome: Applications
Cites Work
- Local data-driven bandwidth choice for density estimation
- On bagging and nonlinear estimation
- A local cross-validation algorithm
- An introduction to the theory of point processes
- A decision-theoretic generalization of on-line learning and an application to boosting
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Consistency of data-driven histogram methods for density estimation and classification
- On weak base hypotheses and their implications for boosting regression and classification
- Analyzing bagging
- Comparison of Discrimination Methods for the Classification of Tumors Using Gene Expression Data
- Minimax nonparametric classification. I: Rates of convergence
- On the finite sample performance of the nearest neighbor classifier
- Nearest neighbor pattern classification
- Random forests
- Using iterated bagging to debias regressions