Accurate ensemble pruning with PL-bagging
DOI: 10.1016/j.csda.2014.09.003
OpenAlex: W1982712747
MaRDI QID: Q1623764
FDO: Q1623764
Authors: Dongjun Chung, Hyunjoong Kim
Publication date: 23 November 2018
Published in: Computational Statistics and Data Analysis
Full work available at URL: https://doi.org/10.1016/j.csda.2014.09.003
MSC Classifications
- Computational methods for problems pertaining to statistics (62-08)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Learning and adaptive systems in artificial intelligence (68T05)
Cites Work
- A decision-theoretic generalization of on-line learning and an application to boosting
- Least angle regression. (With discussion)
- Title not available
- Title not available
- Random forests
- Bagging predictors
- Double-bagging: Combining classifiers by bootstrap aggregation
- Trimmed bagging
- Bundling classifiers by bagging trees
- Improving the precision of classification trees
- Title not available
- A weight-adjusted voting algorithm for ensembles of classifiers
- Using boosting to prune double-bagging ensembles
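The cited works above (bagging, random forests, least angle regression) point at the ingredients of the paper's method: in PL-bagging, where "PL" reportedly stands for positive Lasso, base learners are grown by bagging and then weighted by a Lasso fit constrained to nonnegative coefficients, so learners whose weight shrinks to zero are pruned from the ensemble. The following is a minimal sketch of that idea, not the authors' implementation; the use of scikit-learn, the validation-set fit, and every parameter value are assumptions made purely for illustration.

```python
# Hypothetical sketch of positive-lasso ensemble pruning in the spirit of
# PL-bagging.  All design choices below are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import Lasso
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.33,
                                            random_state=0)

# 1. Grow a bagged ensemble of trees on bootstrap resamples.
n_trees = 50
trees = []
for _ in range(n_trees):
    idx = rng.integers(0, len(X_tr), len(X_tr))  # bootstrap sample
    trees.append(DecisionTreeClassifier().fit(X_tr[idx], y_tr[idx]))

# 2. Collect each tree's class-1 probability on a held-out set; column j
#    holds the predictions of tree j.
P = np.column_stack([t.predict_proba(X_val)[:, 1] for t in trees])

# 3. Fit a Lasso restricted to nonnegative coefficients.  Trees whose
#    weight is driven to exactly zero are pruned from the ensemble.
lasso = Lasso(alpha=0.01, positive=True).fit(P, y_val)
w = lasso.coef_
kept = np.flatnonzero(w)  # assumes alpha is small enough to keep some trees
print(f"kept {kept.size} of {n_trees} trees")

# 4. Predict with the weighted, pruned ensemble.
def predict(X_new):
    scores = lasso.intercept_ + np.column_stack(
        [trees[j].predict_proba(X_new)[:, 1] for j in kept]) @ w[kept]
    return (scores >= 0.5).astype(int)

print(f"validation accuracy: {(predict(X_val) == y_val).mean():.3f}")
```

In this sketch the penalty parameter alpha controls how aggressively the ensemble is pruned: larger values zero out more base learners, trading ensemble size against accuracy.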
Cited In (8)
- Collective-agreement-based pruning of ensembles
- Cost complexity-based pruning of ensemble classifiers
- Joint leaf-refinement and ensemble pruning through \(L_1\) regularization
- Random forest pruning techniques: a recent review
- Pruning of error correcting output codes by optimization of accuracy-diversity trade off
- Using boosting to prune double-bagging ensembles
- Pruning variable selection ensembles
- Title not available