Use of majority votes in statistical learning
From MaRDI portal
Publication: 6604473
DOI: 10.1002/WICS.1362
zbMATH Open: 1545.62166
MaRDI QID: Q6604473
FDO: Q6604473
Publication date: 12 September 2024
Published in: Wiley Interdisciplinary Reviews: Computational Statistics (WIREs Computational Statistics)
Cites Work
- Greedy function approximation: a gradient boosting machine
- Bayesian model averaging: a tutorial (with comments and a rejoinder)
- The Adaptive Lasso and Its Oracle Properties
- Title not available
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Stability Selection
- Title not available
- Title not available
- Random forests
- Bagging predictors
- High-dimensional graphs and variable selection with the Lasso
- Title not available
- Additive logistic regression: a statistical view of boosting (with discussion and a rejoinder by the authors)
- Boosting algorithms: regularization, prediction and model fitting
- Bayes Factors
- Boosting with early stopping: convergence and consistency
- Random lasso
- Boosting With the L2 Loss
- Stochastic gradient boosting
- Measuring the Accuracy of Diagnostic Systems
- Combining Pattern Classifiers
- Title not available
- Title not available
- Title not available
- New multicategory boosting algorithms based on multicategory Fisher-consistent losses
- Majority Voting by Independent Classifiers Can Increase Error Rates
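Beyond the record itself, the publication's topic, combining classifiers by a plurality of their predicted labels as in the bagging and random forest papers cited above, admits a minimal sketch. The code below is not taken from the reviewed publication; the function name `majority_vote` and the toy predictions are illustrative assumptions.

```python
from collections import Counter

def majority_vote(predictions):
    """Plurality vote over hard class labels.

    `predictions` holds one list of predicted labels per classifier;
    all inner lists must have the same length.
    """
    n_samples = len(predictions[0])
    combined = []
    for i in range(n_samples):
        # Tally the label each classifier predicted for sample i.
        votes = Counter(clf_preds[i] for clf_preds in predictions)
        # most_common(1) returns the modal label; ties are broken
        # by whichever label was counted first.
        combined.append(votes.most_common(1)[0][0])
    return combined

# Three hypothetical classifiers voting on five test points:
preds = [
    [0, 1, 1, 0, 1],  # classifier A
    [0, 1, 0, 0, 1],  # classifier B
    [1, 1, 1, 0, 0],  # classifier C
]
print(majority_vote(preds))  # -> [0, 1, 1, 0, 1]
```

With an even number of voters or more than two classes, ties can occur; this sketch breaks them by first-seen order, whereas weighted schemes such as boosting instead weight each vote by classifier performance.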