Risk bounds for CART classifiers under a margin condition
DOI: 10.1016/J.PATCOG.2012.02.021
zbMATH Open: 1242.62055
arXiv: 0902.3130
OpenAlex: W2008282268
MaRDI QID: Q437488
FDO: Q437488
Publication date: 17 July 2012
Published in: Pattern Recognition
Abstract: Risk bounds for Classification and Regression Trees (CART, Breiman et al., 1984) classifiers are obtained under a margin condition in the binary supervised classification framework. These risk bounds are obtained conditionally on the construction of the maximal deep binary tree and make it possible to prove that the linear penalty used in the CART pruning algorithm is valid under a margin condition. It is also shown that, conditionally on the construction of the maximal tree, the final selection by test sample does not dramatically alter the estimation accuracy of the Bayes classifier. In the two-class classification framework, the risk bounds obtained by penalized model selection validate the CART algorithm, which is used in many data-mining applications such as biology, medicine, and image coding.
Full work available at URL: https://arxiv.org/abs/0902.3130
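For orientation, the two objects named in the abstract can be sketched as follows; the notation is illustrative and not necessarily the paper's own (the margin condition shown is the Massart-type strong margin condition, and the penalty is the classical cost-complexity form). With regression function $\eta(x) = \mathbb{P}(Y = 1 \mid X = x)$ and Bayes classifier $f^*(x) = \mathbf{1}_{\{\eta(x) \ge 1/2\}}$,
\[
  |2\eta(X) - 1| \ \ge\ h \quad \text{a.s. for some } h > 0
  \qquad \text{(margin condition)},
\]
and a linear penalty for CART pruning selects, among the subtrees $T$ of the maximal tree $T_{\max}$,
\[
  \widehat{T}_{\alpha} \ =\ \operatorname*{arg\,min}_{T \preceq T_{\max}}
  \Big\{ \widehat{R}_n(T) + \alpha\,\frac{|T|}{n} \Big\}, \qquad \alpha > 0,
\]
where $\widehat{R}_n(T)$ is the empirical misclassification rate of $T$ and $|T|$ its number of leaves.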
Mathematics Subject Classification:
- Bayesian inference (62F15)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
Cites Work
- A decision-theoretic generalization of on-line learning and an application to boosting
- The elements of statistical learning. Data mining, inference, and prediction
- Title not available
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Arcing classifiers. (With discussion)
- Title not available
- Title not available
- Smooth discrimination analysis
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- Une inégalité de Bennett pour les maxima de processus empiriques. (A Bennett type inequality for maxima of empirical processes)
- Model Selection for CART Regression Trees
- Some applications of concentration inequalities to statistics
- Optimal aggregation of classifiers in statistical learning.
- Title not available
- Simultaneous adaptation to the margin and to complexity in classification
- Theory of Classification: a Survey of Some Recent Advances
- Risk bounds for statistical learning
- Square root penalty: Adaption to the margin in classification and in edge estimation
- Analysis of a complexity-based pruning scheme for classification trees
- Optimal dyadic decision trees
- Risk Bounds for Embedded Variable Selection in Classification Trees
- Minimax-optimal classification with dyadic decision trees
- On the Rate of Convergence of Local Averaging Plug-In Classification Rules Under a Margin Condition
- Validating Classification Trees
- Recursive partitioning to reduce distortion
- Margin-adaptive model selection in statistical learning
- Termination and continuity of greedy growing for tree-structured vector quantizers
- Tree pruning with subadditive penalties
- Extrapolative problems in automatic control and the method of potential functions