Improved classification rates under refined margin conditions
From MaRDI portal
Abstract: In this paper we present a simple partitioning-based technique to refine the statistical analysis of classification algorithms. The core of our idea is to divide the input space into two parts, such that the first part contains a suitable vicinity of the decision boundary, while the second part lies sufficiently far away from it. Using a set of margin conditions, we are then able to control the classification error on each part separately. Balancing these two error terms yields a refined error analysis in a final step. We apply this general idea to the histogram rule and show that even for this simple method we obtain, under certain assumptions, better rates than those known for support vector machines, for certain plug-in classifiers, and for a recently analyzed tree-based adaptive-partitioning ansatz. Moreover, we show that a margin condition relating the critical noise to the decision boundary makes it possible to improve on the optimal rates proven for distributions without this margin condition.
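The histogram rule analyzed in the abstract partitions the input space into congruent cells and classifies each point by a majority vote over the training labels falling in its cell. The following is a minimal sketch of that classical rule, not the paper's refined analysis; it assumes binary labels and features scaled to the unit cube, and the function names are illustrative.

```python
import numpy as np

def histogram_rule_fit(X, y, n_bins):
    """Fit the histogram rule on data in [0, 1]^d:
    partition the cube into n_bins^d congruent cells and
    record the label counts of the samples in each cell."""
    # map each sample to its cell index (clip so x = 1.0 stays in range)
    cells = np.minimum((X * n_bins).astype(int), n_bins - 1)
    votes = {}
    for idx, label in zip(map(tuple, cells), y):
        counts = votes.setdefault(idx, [0, 0])
        counts[int(label)] += 1
    # majority vote per cell (ties and empty cells default to label 0)
    return {idx: int(c[1] > c[0]) for idx, c in votes.items()}

def histogram_rule_predict(model, X, n_bins):
    """Predict by looking up the majority label of each point's cell."""
    cells = np.minimum((X * n_bins).astype(int), n_bins - 1)
    return np.array([model.get(tuple(c), 0) for c in cells])
```

The bin width plays the role of the smoothing parameter whose choice the paper's two-part error decomposition optimizes: near the decision boundary the cell size governs the approximation error, while far from it the noise conditions keep the majority vote reliable.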
Recommendations
- On the Rate of Convergence of Local Averaging Plug-In Classification Rules Under a Margin Condition
- Improved classification rates for localized algorithms under margin conditions
- Classification algorithms using adaptive partitioning
- On the rate of convergence of error estimates for the partitioning classification rule
- Rate of convergence of \(k\)-nearest-neighbor classification rule
Cites work
- scientific article; zbMATH DE number 893887 (title unavailable)
- scientific article; zbMATH DE number 3280855 (title unavailable)
- Classification algorithms using adaptive partitioning
- Fast learning rates for plug-in classifiers
- Fully adaptive density-based clustering
- On the Rate of Convergence of Local Averaging Plug-In Classification Rules Under a Margin Condition
- Random forests
- Risk bounds for statistical learning
- Support Vector Machines
Cited in (7)
- Improved classification rates for localized algorithms under margin conditions
- scientific article; zbMATH DE number 7415094 (title unavailable)
- Classification algorithms using adaptive partitioning
- Rate of convergence of \(k\)-nearest-neighbor classification rule
- Intrinsic dimension adaptive partitioning for kernel methods
- Adaptive learning rates for support vector machines working on data with low intrinsic dimension
- On the Rate of Convergence of Local Averaging Plug-In Classification Rules Under a Margin Condition
MaRDI item: Q1746541