Recommendations
Cites work
- A Recursive Partitioning Decision Rule for Nonparametric Classification
- A conversation with Leo Breiman.
- A decision-theoretic generalization of on-line learning and an application to boosting
- A theory of the learnable
- Bagging predictors
- Boosting a weak learning algorithm by majority
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Cryptographic limitations on learning Boolean formulae and finite automata
- Estimation of dependences based on empirical data. Transl. from the Russian by Samuel Kotz
- Heuristics of instability and stabilization in model selection
Cited in
- Statistical modeling: The two cultures. (With comments and a rejoinder).
- Extremely randomized trees
- Generalised indirect classifiers
- A conversation with Larry Brown
- Adjusting the outputs of a classifier to new a priori probabilities: A simple procedure
- A conversation with Jerry Friedman
- Attractor networks for shape recognition
- Semiparametric regression during 2003--2007
- Sparse projection oblique randomer forests
- Objective model selection with parallel genetic algorithms using an eradication strategy
- Different Paradigms for Choosing Sequential Reweighting Algorithms
- A note on margin-based loss functions in classification
- Improving nonparametric regression methods by bagging and boosting.
- A new accelerated proximal boosting machine with convergence rate \(O(1/t^2)\)
- Bundling classifiers by bagging trees
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Classification by ensembles from random partitions of high-dimensional data
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Trimmed bagging
- Learning model trees from evolving data streams
- Hellinger distance decision trees are robust and skew-insensitive
- New aspects of Bregman divergence in regression and classification with parametric and nonparametric estimation
- Double-bagging: Combining classifiers by bootstrap aggregation
- Boosting in the presence of outliers: adaptive classification with nonconvex loss functions
- A nonlinear aggregation type classifier
- Interpreting neural-network results: a simulation study.
- Delta Boosting Machine with Application to General Insurance
- Remembering Leo Breiman
- Remembrance of Leo Breiman
- Bounding the generalization error of convex combinations of classifiers: Balancing the dimensionality and the margins.
- Time series forecasting with multiple candidate models: selecting or combining?
- Forecasting China's foreign trade volume with a kernel-based hybrid econometric-AI ensemble learning approach
- On the Bayes-risk consistency of regularized boosting methods.
- SVM-boosting based on Markov resampling: theory and algorithm
- Analyzing bagging
- A Bayesian Random Split to Build Ensembles of Classification Trees
- Variance reduction trends on `boosted' classifiers
- Machine learning acceleration for nonlinear solvers applied to multiphase porous media flow
- Statistical behavior and consistency of classification methods based on convex risk minimization.
- A statistical approach to growing a reliable honest tree.
- Boosting and instability for regression trees
- General sparse boosting: improving feature selection of \(L_{2}\) boosting by correlation-based penalty family
- Quadratic boosting
- Structural, Syntactic, and Statistical Pattern Recognition
- An empirical comparison of ensemble methods based on classification trees
- Experimental study for the comparison of classifier combination methods
- Application of “Aggregated Classifiers” in Survival Time Studies
- Boosting for high-dimensional linear models
- Quantum adiabatic machine learning
- Boosting with Noisy Data: Some Views from Statistical Theory
- Vote counting measures for ensemble classifiers.
- Aggregating classifiers with ordinal response structure
- Classifying G-protein coupled receptors with bagging classification tree
- Risk bounds for CART classifiers under a margin condition
- An empirical bias–variance analysis of DECORATE ensemble method at different training sample sizes
- Gradient boosting for linear mixed models
- Bandwidth choice for nonparametric classification
- Boosting in Cox regression: a comparison between the likelihood-based and the model-based approaches with focus on the R-packages \textit{CoxBoost} and \textit{mboost}
- A conversation with Leo Breiman.
- Classification by evolutionary ensembles
- Boosting with early stopping: convergence and consistency
- Diversification for better classification trees
- Bias-corrected random forests in regression
- Accelerated gradient boosting
- Out-of-bag estimation of the optimal sample size in bagging
- Machine learning feature analysis illuminates disparity between E3SM climate models and observed climate change
- Assessing the stability of classification trees using Florida birth data
- Using LogitBoost classifier to predict protein structural classes
- An empirical study of using Rotation Forest to improve regressors
- Density estimation with stagewise optimization of the empirical risk
- \(L_{2}\) boosting in kernel regression
- Nested cross-validation with ensemble feature selection and classification model for high-dimensional biological data
- Improved customer choice predictions using ensemble methods
- An efficient modified boosting method for solving classification problems
- A novel ensemble model -- the random granular reflections
- Canonical forest
- Comment on: Boosting algorithms: regularization, prediction and model fitting
- Generalization error of combined classifiers.
- Pruning of error correcting output codes by optimization of accuracy-diversity trade off
- A weight-adjusted voting algorithm for ensembles of classifiers
- Modelling additive extremile regression by iteratively penalized least asymmetric weighted squares and gradient descent boosting
- On the fusion of threshold classifiers for categorization and dimensionality reduction
- Stochastic boosting algorithms
- An extensive comparison of recent classification tools applied to microarray data
- Arbitrating among competing classifiers using learned referees
- Model-based boosting in R: a hands-on tutorial using the R package mboost
- A local boosting algorithm for solving classification problems
- Quantum AdaBoost algorithm via cluster state
- A novel sparse least squares support vector machines
- A cooperative constructive method for neural networks for pattern recognition
- Population theory for boosting ensembles.
- Ensemble Subset Regression (ENSURE): Efficient High-dimensional Prediction
- A bootstrap-based aggregate classifier for model-based clustering
- A distributed algorithm for high-dimension convex quadratically constrained quadratic programs
- Using boosting to prune double-bagging ensembles
- A new kernel regression approach for robustified \(L_{2}\) boosting
- Process consistency for AdaBoost.
- On weak base hypotheses and their implications for boosting regression and classification
This page was built for publication: Arcing classifiers. (With discussion)
MaRDI item Q1807115