Arcing classifiers. (With discussion)
From MaRDI portal
Publication: 1807115
DOI: 10.1214/aos/1024691079
zbMath: 0934.62064
OpenAlex: W2067885219
Wikidata: Q56114421
Scholia: Q56114421
MaRDI QID: Q1807115
Publication date: 9 November 1999
Published in: The Annals of Statistics
Full work available at URL: https://doi.org/10.1214/aos/1024691079
Keywords: neural networks; Markov chain Monte Carlo; decision trees; bagging; boosting; ensemble methods; output coding; error-correcting
Related Items
- Deep learning: a statistical viewpoint
- Tweedie gradient boosting for extremely unbalanced zero-inflated data
- CatBoost — An Ensemble Machine Learning Model for Prediction and Classification of Student Academic Performance
- Ensemble Subset Regression (ENSURE): Efficient High-dimensional Prediction
- Estimation and inference of treatment effects with \(L_2\)-boosting in high-dimensional settings
- Nested cross-validation with ensemble feature selection and classification model for high-dimensional biological data
- Bias-corrected random forests in regression
- An empirical bias–variance analysis of DECORATE ensemble method at different training sample sizes
- An empirical comparison of ensemble methods based on classification trees
- Extremely randomized trees
- A Novel Ensemble Model - The Random Granular Reflections
- Improving nonparametric regression methods by bagging and boosting.
- A statistical approach to growing a reliable honest tree.
- Stochastic boosting algorithms
- Nonparametric multiple expectile regression via ER-Boost
- A Markov-modulated tree-based gradient boosting model for auto-insurance risk premium pricing
- General Sparse Boosting: Improving Feature Selection of L2Boosting by Correlation-Based Penalty Family
- Optimization by Gradient Boosting
- Bounding the generalization error of convex combinations of classifiers: Balancing the dimensionality and the margins.
- Generalization error of combined classifiers.
- A conversation with Larry Brown
- A nonlinear aggregation type classifier
- Comparing boosting and bagging for decision trees of rankings
- Population theory for boosting ensembles.
- Process consistency for AdaBoost.
- On the Bayes-risk consistency of regularized boosting methods.
- Statistical behavior and consistency of classification methods based on convex risk minimization.
- Classification by evolutionary ensembles
- A precise high-dimensional asymptotic theory for boosting and minimum-\(\ell_1\)-norm interpolated classifiers
- Density estimation with stagewise optimization of the empirical risk
- Quadratic boosting
- Experimental study for the comparison of classifier combination methods
- A cooperative constructive method for neural networks for pattern recognition
- Boosting with Noisy Data: Some Views from Statistical Theory
- Different Paradigms for Choosing Sequential Reweighting Algorithms
- Quantum adiabatic machine learning
- Improved customer choice predictions using ensemble methods
- \(L_{2}\) boosting in kernel regression
- Pruning of error correcting output codes by optimization of accuracy-diversity trade off
- Variable selection for generalized linear mixed models by \(L_1\)-penalized estimation
- Learning model trees from evolving data streams
- Hellinger distance decision trees are robust and skew-insensitive
- Accelerated gradient boosting
- Using LogitBoost classifier to predict protein structural classes
- A bootstrap-based aggregate classifier for model-based clustering
- Double-bagging: Combining classifiers by bootstrap aggregation
- Risk bounds for CART classifiers under a margin condition
- Semiparametric regression during 2003--2007
- Survival ensembles by the sum of pairwise differences with application to lung cancer microarray studies
- Statistical uncertainty estimation using random forests and its application to drought forecast
- Boosting algorithms: regularization, prediction and model fitting
- Comment on: Boosting algorithms: regularization, prediction and model fitting
- Quantum AdaBoost algorithm via cluster state
- Vote counting measures for ensemble classifiers.
- Detection of outliers in geochemical data using ensembles of subsets of variables
- A conversation with Leo Breiman.
- Statistical modeling: The two cultures. (With comments and a rejoinder).
- A novel sparse least squares support vector machines
- Attractor Networks for Shape Recognition
- Assessing the stability of classification trees using Florida birth data
- Application of “Aggregated Classifiers” in Survival Time Studies
- Bagging Tree Classifiers for Glaucoma Diagnosis
- Machine learning acceleration for nonlinear solvers applied to multiphase porous media flow
- An empirical study of using Rotation Forest to improve regressors
- Boosting conditional probability estimators
- Analysis of boosting algorithms using the smooth margin function
- An extensive comparison of recent classification tools applied to microarray data
- Generalised indirect classifiers
- Bundling classifiers by bagging trees
- Boosting and instability for regression trees
- Boosting additive models using component-wise P-splines
- Using boosting to prune double-bagging ensembles
- An efficient modified boosting method for solving classification problems
- Taxonomy for characterizing ensemble methods in classification tasks: a review and annotated bibliography
- Model-based boosting in R: a hands-on tutorial using the R package mboost
- Remembering Leo Breiman
- Remembrance of Leo Breiman
- Canonical forest
- Classifying G-protein coupled receptors with bagging classification tree
- A note on margin-based loss functions in classification
- Adjusting the Outputs of a Classifier to New a Priori Probabilities: A Simple Procedure
- Aggregating classifiers with ordinal response structure
- Boosting for high-dimensional linear models
- Diversification for better classification trees
- Bandwidth choice for nonparametric classification
- Machine learning feature analysis illuminates disparity between E3SM climate models and observed climate change
- Delta Boosting Machine with Application to General Insurance
- On the fusion of threshold classifiers for categorization and dimensionality reduction
- Out-of-bag estimation of the optimal sample size in bagging
- ON MATHEMATICAL MODELLING OF SYNTHETIC MEASURES
- Boosting in Cox regression: a comparison between the likelihood-based and the model-based approaches with focus on the R-packages \textit{CoxBoost} and \textit{mboost}
- Theory of Classification: a Survey of Some Recent Advances
- Boosting in the Presence of Outliers: Adaptive Classification With Nonconvex Loss Functions
- A weight-adjusted voting algorithm for ensembles of classifiers
- Boosting the margin: a new explanation for the effectiveness of voting methods
- A distributed algorithm for high-dimension convex quadratically constrained quadratic programs
- New aspects of Bregman divergence in regression and classification with parametric and nonparametric estimation
- Classification by ensembles from random partitions of high-dimensional data
- Trimmed bagging
- SVM-boosting based on Markov resampling: theory and algorithm
- A Bayesian Random Split to Build Ensembles of Classification Trees
- A local boosting algorithm for solving classification problems
- Multivariate data analysis and modeling through classification and regression trees
- Forecasting China's foreign trade volume with a kernel-based hybrid econometric-AI ensemble learning approach
- Boosting as a kernel-based method
- Noisy replication in skewed binary classification.
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- On weak base hypotheses and their implications for boosting regression and classification
- Analyzing bagging
- AdaBoost and robust one-bit compressed sensing
- Time series forecasting with multiple candidate models: selecting or combining?
- Complexities of convex combinations and bounding the generalization error in classification
- Boosting with early stopping: convergence and consistency
- A new accelerated proximal boosting machine with convergence rate \(O(1/t^2)\)
- Interpreting neural-network results: a simulation study.
- A conversation with Jerry Friedman
- Accurate tree-based missing data imputation and data fusion within the statistical learning paradigm
Uses Software
Cites Work
- Bagging predictors
- Estimation of dependences based on empirical data. Transl. from the Russian by Samuel Kotz
- Heuristics of instability and stabilization in model selection
- A decision-theoretic generalization of on-line learning and an application to boosting
- A conversation with Leo Breiman.
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Boosting a weak learning algorithm by majority
- A theory of the learnable
- A Recursive Partitioning Decision Rule for Nonparametric Classification
- Cryptographic limitations on learning Boolean formulae and finite automata