Arcing classifiers. (With discussion)
From MaRDI portal
Publication: 1807115
DOI: 10.1214/AOS/1024691079
zbMATH Open: 0934.62064
OpenAlex: W2067885219
Wikidata: Q56114421 (Scholia: Q56114421)
MaRDI QID: Q1807115
FDO: Q1807115
Authors: Leo Breiman
Publication date: 9 November 1999
Published in: The Annals of Statistics
Full work available at URL: https://doi.org/10.1214/aos/1024691079
Keywords: Markov chain Monte Carlo; boosting; decision trees; ensemble methods; neural networks; bagging; error-correcting output coding
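This record's subject is Breiman's adaptively resampled ("arcing") ensembles. As a rough, self-contained sketch of the arc-x4 variant (resampling probability proportional to 1 + m(i)^4, where m(i) counts past misclassifications of example i, combined by unweighted majority vote); the 1-D stump learner, toy data, and parameter values here are illustrative assumptions, not taken from the paper:

```python
import random

def stump_fit(xs, ys):
    """Weak learner: best single-threshold stump on 1-D data (illustrative)."""
    best = None  # (error, threshold, sign)
    for t in sorted(set(xs)):
        for sign in (1, -1):
            err = sum((sign if x >= t else -sign) != y for x, y in zip(xs, ys))
            if best is None or err < best[0]:
                best = (err, t, sign)
    _, t, sign = best
    return lambda x: sign if x >= t else -sign

def arc_x4(xs, ys, rounds=10, seed=0):
    """Arc-x4 sketch: resample with weight 1 + m(i)**4, where m(i) counts
    how often example i has been misclassified by earlier members."""
    rng = random.Random(seed)
    n = len(xs)
    miss = [0] * n
    members = []
    for _ in range(rounds):
        weights = [1 + m ** 4 for m in miss]           # adaptive resampling weights
        idx = rng.choices(range(n), weights=weights, k=n)
        h = stump_fit([xs[i] for i in idx], [ys[i] for i in idx])
        members.append(h)
        for i in range(n):                             # update miss counts on full data
            if h(xs[i]) != ys[i]:
                miss[i] += 1
    # final classifier: unweighted majority vote over all members
    return lambda x: 1 if sum(h(x) for h in members) >= 0 else -1

# Toy separable data: negatives below 3.0, positives at or above it.
xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [-1, -1, -1, 1, 1, 1]
clf = arc_x4(xs, ys)
preds = [clf(x) for x in xs]
```

The hard-example reweighting is the key difference from bagging, whose resampling weights stay uniform across rounds.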
Cites Work
- A decision-theoretic generalization of on-line learning and an application to boosting
- Heuristics of instability and stabilization in model selection
- Title not available
- Bagging predictors
- Title not available
- Title not available
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Title not available
- Boosting a weak learning algorithm by majority
- Title not available
- Estimation of dependences based on empirical data. Transl. from the Russian by Samuel Kotz
- A theory of the learnable
- Cryptographic limitations on learning Boolean formulae and finite automata
- A conversation with Leo Breiman.
- A Recursive Partitioning Decision Rule for Nonparametric Classification
Cited In (only showing first 100 items)
- A conversation with Larry Brown
- A conversation with Jerry Friedman
- Semiparametric regression during 2003–2007
- Improving nonparametric regression methods by bagging and boosting.
- A note on margin-based loss functions in classification
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Bundling classifiers by bagging trees
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Trimmed bagging
- Boosting in the presence of outliers: adaptive classification with nonconvex loss functions
- New aspects of Bregman divergence in regression and classification with parametric and nonparametric estimation
- Double-bagging: Combining classifiers by bootstrap aggregation
- Learning model trees from evolving data streams
- Hellinger distance decision trees are robust and skew-insensitive
- Interpreting neural-network results: a simulation study.
- A nonlinear aggregation type classifier
- Remembering Leo Breiman
- Remembrance of Leo Breiman
- On the Bayes-risk consistency of regularized boosting methods.
- Forecasting China's foreign trade volume with a kernel-based hybrid econometric-AI ensemble learning approach
- Analyzing bagging
- A statistical approach to growing a reliable honest tree.
- Statistical behavior and consistency of classification methods based on convex risk minimization.
- An empirical comparison of ensemble methods based on classification trees
- Quadratic boosting
- Boosting and instability for regression trees
- Experimental study for the comparison of classifier combination methods
- Boosting for high-dimensional linear models
- Quantum adiabatic machine learning
- Aggregating classifiers with ordinal response structure
- Risk bounds for CART classifiers under a margin condition
- Bandwidth choice for nonparametric classification
- Boosting in Cox regression: a comparison between the likelihood-based and the model-based approaches with focus on the R-packages CoxBoost and mboost
- Bias-corrected random forests in regression
- Boosting with early stopping: convergence and consistency
- A conversation with Leo Breiman.
- Accelerated gradient boosting
- Out-of-bag estimation of the optimal sample size in bagging
- Density estimation with stagewise optimization of the empirical risk
- L2 boosting in kernel regression
- Improved customer choice predictions using ensemble methods
- Comment on: Boosting algorithms: regularization, prediction and model fitting
- Stochastic boosting algorithms
- Pruning of error correcting output codes by optimization of accuracy-diversity trade off
- A weight-adjusted voting algorithm for ensembles of classifiers
- An extensive comparison of recent classification tools applied to microarray data
- Model-based boosting in R: a hands-on tutorial using the R package mboost
- A local boosting algorithm for solving classification problems
- Ensemble Subset Regression (ENSURE): Efficient High-dimensional Prediction
- Population theory for boosting ensembles.
- A cooperative constructive method for neural networks for pattern recognition
- A novel sparse least squares support vector machines
- A bootstrap-based aggregate classifier for model-based clustering
- Using boosting to prune double-bagging ensembles
- Process consistency for AdaBoost.
- Accurate tree-based missing data imputation and data fusion within the statistical learning paradigm
- Taxonomy for characterizing ensemble methods in classification tasks: a review and annotated bibliography
- Survival ensembles by the sum of pairwise differences with application to lung cancer microarray studies
- Boosting algorithms: regularization, prediction and model fitting
- Nonparametric multiple expectile regression via ER-Boost
- Extremely randomized trees
- Analysis of boosting algorithms using the smooth margin function
- Comparing boosting and bagging for decision trees of rankings
- Complexities of convex combinations and bounding the generalization error in classification
- Boosting conditional probability estimators
- Variable selection for generalized linear mixed models by L1-penalized estimation
- Title not available
- Optimization by Gradient Boosting
- Multivariate data analysis and modeling through classification and regression trees
- Theory of Classification: a Survey of Some Recent Advances
- Extremely randomized trees
- Boosting additive models using component-wise P-splines
- Statistical modeling: The two cultures. (With comments and a rejoinder).
- Adjusting the outputs of a classifier to new a priori probabilities: A simple procedure
- Attractor networks for shape recognition
- Generalised indirect classifiers
- Objective model selection with parallel genetic algorithms using an eradication strategy
- Sparse projection oblique randomer forests
- Different Paradigms for Choosing Sequential Reweighting Algorithms
- A new accelerated proximal boosting machine with convergence rate O(1/t^2)
- Classification by ensembles from random partitions of high-dimensional data
- Title not available
- Delta Boosting Machine with Application to General Insurance
- Time series forecasting with multiple candidate models: selecting or combining?
- Bounding the generalization error of convex combinations of classifiers: Balancing the dimensionality and the margins.
- A Bayesian Random Split to Build Ensembles of Classification Trees
- SVM-boosting based on Markov resampling: theory and algorithm
- Variance reduction trends on `boosted' classifiers
- Machine learning acceleration for nonlinear solvers applied to multiphase porous media flow
- General sparse boosting: improving feature selection of L2 boosting by correlation-based penalty family
- Structural, Syntactic, and Statistical Pattern Recognition
- Application of “Aggregated Classifiers” in Survival Time Studies
- Boosting with Noisy Data: Some Views from Statistical Theory
- Gradient boosting for linear mixed models
- Vote counting measures for ensemble classifiers.
- An empirical bias–variance analysis of DECORATE ensemble method at different training sample sizes
- Classifying G-protein coupled receptors with bagging classification tree
- Classification by evolutionary ensembles
- Diversification for better classification trees