Risk bounds for CART classifiers under a margin condition
Abstract: Risk bounds for Classification and Regression Trees (CART, Breiman et al. 1984) classifiers are obtained under a margin condition in the binary supervised classification framework. These risk bounds hold conditionally on the construction of the maximal binary tree and allow one to prove that the linear penalty used in the CART pruning algorithm is valid under a margin condition. It is also shown that, conditionally on the construction of the maximal tree, the final selection by test sample does not dramatically alter the estimation accuracy of the Bayes classifier. In the two-class classification framework, the risk bounds, which are proved using penalized model selection, validate the CART algorithm, widely used in data mining applications in fields such as biology, medicine, and image coding.
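The pipeline the abstract describes (grow a maximal tree, prune it with a linear penalty proportional to the number of leaves, then pick the final subtree on a held-out test sample) can be sketched with scikit-learn's cost-complexity pruning. This is an illustrative sketch, not the paper's own code; the synthetic dataset, split sizes, and random seeds are arbitrary assumptions.

```python
# Sketch of CART cost-complexity pruning followed by final selection on a
# test sample. Dataset and parameters are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=600, n_features=10, random_state=0)
# Growing sample vs. a separate test sample for the final selection step.
X_grow, X_test, y_grow, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

# Growing the maximal tree yields a nested sequence of pruned subtrees,
# one per penalty level alpha in the linear penalty alpha * (#leaves).
path = DecisionTreeClassifier(random_state=0).cost_complexity_pruning_path(
    X_grow, y_grow
)

# Fit one pruned subtree per alpha and keep the one with the best
# test-sample accuracy ("final selection by test sample").
best = max(
    (
        DecisionTreeClassifier(random_state=0, ccp_alpha=a).fit(X_grow, y_grow)
        for a in path.ccp_alphas
    ),
    key=lambda t: t.score(X_test, y_test),
)
print(best.get_n_leaves(), round(best.score(X_test, y_test), 3))
```

The margin condition in the paper is a distributional assumption used to derive the risk bounds; it does not appear in the code, which only mirrors the algorithmic pruning-and-selection procedure the bounds validate.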
Cites work
- scientific article; zbMATH DE number 3860199 (no title available)
- scientific article; zbMATH DE number 1332320 (no title available)
- scientific article; zbMATH DE number 3446442 (no title available)
- scientific article; zbMATH DE number 893887 (no title available)
- A decision-theoretic generalization of on-line learning and an application to boosting
- Analysis of a complexity-based pruning scheme for classification trees
- Arcing classifiers. (With discussion)
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Extrapolative problems in automatic control and the method of potential functions
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- Margin-adaptive model selection in statistical learning
- Minimax-optimal classification with dyadic decision trees
- Model Selection for CART Regression Trees
- On the Rate of Convergence of Local Averaging Plug-In Classification Rules Under a Margin Condition
- Optimal aggregation of classifiers in statistical learning.
- Optimal dyadic decision trees
- Recursive partitioning to reduce distortion
- Risk Bounds for Embedded Variable Selection in Classification Trees
- Risk bounds for statistical learning
- Simultaneous adaptation to the margin and to complexity in classification
- Smooth discrimination analysis
- Some applications of concentration inequalities to statistics
- Square root penalty: Adaption to the margin in classification and in edge estimation
- Termination and continuity of greedy growing for tree-structured vector quantizers
- The elements of statistical learning. Data mining, inference, and prediction
- Theory of Classification: a Survey of Some Recent Advances
- Tree pruning with subadditive penalties
- Une inégalité de Bennett pour les maxima de processus empiriques. (A Bennett-type inequality for maxima of empirical processes)
- Validating Classification Trees
Cited in (4)