A weight-adjusted voting algorithm for ensembles of classifiers
From MaRDI portal
Publication: 743769
DOI: 10.1016/j.jkss.2011.03.002 · zbMath: 1296.62131 · OpenAlex: W1974844180 · MaRDI QID: Q743769
Hyunjoong Kim, Hojin Moon, Hongshik Ahn, Hyeuk Kim
Publication date: 30 September 2014
Published in: Journal of the Korean Statistical Society
Full work available at URL: https://doi.org/10.1016/j.jkss.2011.03.002
Classification:
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Voting theory (91B12)
- Pattern recognition, speech recognition (68T10)
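The title refers to weight-adjusted voting over an ensemble of classifiers. The paper's exact weighting scheme is behind the DOI above; as a generic illustration only (not necessarily the authors' method), a weighted majority vote can be sketched as follows, where the per-classifier weights — here imagined as validation accuracies — are an assumption for the example:

```python
from collections import defaultdict

def weighted_vote(predictions, weights):
    """Combine class predictions from several classifiers by weighted voting.

    predictions: list of class labels, one per classifier
    weights: list of non-negative voting weights, one per classifier
    Returns the label with the largest total weight.
    """
    score = defaultdict(float)
    for label, w in zip(predictions, weights):
        score[label] += w
    return max(score, key=score.get)

# Three classifiers vote on one sample; weights are illustrative
# (e.g., each classifier's validation accuracy).
preds = ["A", "B", "B"]
weights = [0.9, 0.6, 0.5]
print(weighted_vote(preds, weights))  # prints "B" (B scores 1.1 vs A's 0.9)
```

Unweighted majority voting is the special case where all weights are equal; weight adjustment lets stronger base classifiers dominate ties.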
Related Items (6)
- Accurate ensemble pruning with PL-bagging
- Double random forest
- Canonical forest
- A long short-term memory ensemble approach for improving the outcome prediction in intensive care unit
- A weighted random forests approach to improve predictive performance
- Adaptive multi-classifier fusion approach for gene expression dataset based on probabilistic theory
Uses Software
Cites Work
- Bagging predictors
- Multi-class AdaBoost
- Bundling classifiers by bagging trees
- Taxonomy for characterizing ensemble methods in classification tasks: a review and annotated bibliography
- Improving the precision of classification trees
- A local boosting algorithm for solving classification problems
- Double-bagging: Combining classifiers by bootstrap aggregation
- Arcing classifiers. (With discussion)
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- The elements of statistical learning. Data mining, inference, and prediction
- Random forests