Using boosting to prune double-bagging ensembles
From MaRDI portal
Publication:961263
DOI: 10.1016/j.csda.2008.10.040
zbMath: 1452.62145
OpenAlex: W2022733315
MaRDI QID: Q961263
Jiang-She Zhang, Chun-Xia Zhang, Gai-Ying Zhang
Publication date: 30 March 2010
Published in: Computational Statistics and Data Analysis
Full work available at URL: https://doi.org/10.1016/j.csda.2008.10.040
- Computational methods for problems pertaining to statistics (62-08)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Learning and adaptive systems in artificial intelligence (68T05)
Related Items
- Accurate ensemble pruning with PL-bagging
- Ensemble classification of paired data
- Multiclass classification and gene selection with a stochastic algorithm
- Taxonomy for characterizing ensemble methods in classification tasks: a review and annotated bibliography
- Out-of-Bag Estimation of the Optimal Hyperparameter in SubBag Ensemble Method
Cites Work
- Greedy function approximation: A gradient boosting machine
- Bagging predictors
- Aggregating classifiers with mathematical programming
- Bundling classifiers by bagging trees
- A stochastic approximation view of boosting
- A local boosting algorithm for solving classification problems
- Robustified \(L_2\) boosting
- Multivariate adaptive regression splines
- A decision-theoretic generalization of on-line learning and an application to boosting
- Double-bagging: Combining classifiers by bootstrap aggregation
- Constructing support vector machine ensemble
- Ensembling neural networks: Many could be better than all
- Arcing classifiers. (With discussion)
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- An approach to the automatic design of multiple classifier systems
- Combining Pattern Classifiers
- Experiments with AdaBoost.RT, an Improved Boosting Scheme for Regression
- Soft margins for AdaBoost
- Random forests
- Decision templates for multiple classifier fusion: An experimental comparison
- Using iterated bagging to debias regressions
- Boosting methods for regression
- Cost complexity-based pruning of ensemble classifiers