Cites work
- scientific article; zbMATH DE number 3860199 (no title available)
- scientific article; zbMATH DE number 1149409 (no title available)
- scientific article; zbMATH DE number 1149421 (no title available)
- scientific article; zbMATH DE number 784362 (no title available)
- scientific article; zbMATH DE number 823069 (no title available)
- A Recursive Partitioning Decision Rule for Nonparametric Classification
- A conversation with Leo Breiman.
- A decision-theoretic generalization of on-line learning and an application to boosting
- A theory of the learnable
- Bagging predictors
- Boosting a weak learning algorithm by majority
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Cryptographic limitations on learning Boolean formulae and finite automata
- Estimation of dependences based on empirical data. Transl. from the Russian by Samuel Kotz
- Heuristics of instability and stabilization in model selection
Cited in
- Accurate tree-based missing data imputation and data fusion within the statistical learning paradigm
- Tweedie gradient boosting for extremely unbalanced zero-inflated data
- Taxonomy for characterizing ensemble methods in classification tasks: a review and annotated bibliography
- Survival ensembles by the sum of pairwise differences with application to lung cancer microarray studies
- Estimation and inference of treatment effects with \(L_2\)-boosting in high-dimensional settings
- Insurance Premium Prediction via Gradient Tree-Boosted Tweedie Compound Poisson Models
- Boosting algorithms: regularization, prediction and model fitting
- Statistical uncertainty estimation using random forests and its application to drought forecast
- Prediction of wind loading on masked angle members in lattice tower structures
- Deep learning: a statistical viewpoint
- A Markov-modulated tree-based gradient boosting model for auto-insurance risk premium pricing
- Analysis of boosting algorithms using the smooth margin function
- Comparing boosting and bagging for decision trees of rankings
- Nonparametric multiple expectile regression via ER-Boost
- Extremely randomized trees
- Complexities of convex combinations and bounding the generalization error in classification
- Boosting conditional probability estimators
- Variable selection for generalized linear mixed models by \(L_1\)-penalized estimation
- scientific article; zbMATH DE number 1759568 (no title available)
- CatBoost — An Ensemble Machine Learning Model for Prediction and Classification of Student Academic Performance
- Boosting as a kernel-based method
- On mathematical modelling of synthetic measures
- Bagging tree classifiers for glaucoma diagnosis
- Multivariate data analysis and modeling through classification and regression trees
- Boosting additive models using component-wise P-splines
- Noisy replication in skewed binary classification.
- AdaBoost and robust one-bit compressed sensing
- Theory of Classification: a Survey of Some Recent Advances
- Optimization by Gradient Boosting
- A precise high-dimensional asymptotic theory for boosting and minimum-\(\ell_1\)-norm interpolated classifiers
- Detection of outliers in geochemical data using ensembles of subsets of variables
- Statistical modeling: The two cultures. (With comments and a rejoinder).
- Generalised indirect classifiers
- A conversation with Larry Brown
- Adjusting the outputs of a classifier to new a priori probabilities: A simple procedure
- A conversation with Jerry Friedman
- Attractor networks for shape recognition
- Semiparametric regression during 2003--2007
- Sparse projection oblique randomer forests
- Objective model selection with parallel genetic algorithms using an eradication strategy
- Different Paradigms for Choosing Sequential Reweighting Algorithms
- A note on margin-based loss functions in classification
- Improving nonparametric regression methods by bagging and boosting.
- A new accelerated proximal boosting machine with convergence rate \(O(1/t^2)\)
- Bundling classifiers by bagging trees
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Classification by ensembles from random partitions of high-dimensional data
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Trimmed bagging
- scientific article; zbMATH DE number 7307480 (no title available)
- Learning model trees from evolving data streams
- Hellinger distance decision trees are robust and skew-insensitive
- New aspects of Bregman divergence in regression and classification with parametric and nonparametric estimation
- Double-bagging: Combining classifiers by bootstrap aggregation
- Boosting in the presence of outliers: adaptive classification with nonconvex loss functions
- A nonlinear aggregation type classifier
- Interpreting neural-network results: a simulation study.
- Delta Boosting Machine with Application to General Insurance
- Remembering Leo Breiman
- Remembrance of Leo Breiman
- Bounding the generalization error of convex combinations of classifiers: Balancing the dimensionality and the margins.
- Time series forecasting with multiple candidate models: selecting or combining?
- Forecasting China's foreign trade volume with a kernel-based hybrid econometric-AI ensemble learning approach
- On the Bayes-risk consistency of regularized boosting methods.
- SVM-boosting based on Markov resampling: theory and algorithm
- Analyzing bagging
- A Bayesian Random Split to Build Ensembles of Classification Trees
- Variance reduction trends on `boosted' classifiers
- Machine learning acceleration for nonlinear solvers applied to multiphase porous media flow
- Statistical behavior and consistency of classification methods based on convex risk minimization.
- A statistical approach to growing a reliable honest tree.
- Boosting and instability for regression trees
- General sparse boosting: improving feature selection of \(L_{2}\) boosting by correlation-based penalty family
- Quadratic boosting
- Structural, Syntactic, and Statistical Pattern Recognition
- An empirical comparison of ensemble methods based on classification trees
- Experimental study for the comparison of classifier combination methods
- Application of “Aggregated Classifiers” in Survival Time Studies
- Boosting for high-dimensional linear models
- Quantum adiabatic machine learning
- Boosting with Noisy Data: Some Views from Statistical Theory
- Vote counting measures for ensemble classifiers.
- Aggregating classifiers with ordinal response structure
- Classifying G-protein coupled receptors with bagging classification tree
- Risk bounds for CART classifiers under a margin condition
- An empirical bias–variance analysis of DECORATE ensemble method at different training sample sizes
- Gradient boosting for linear mixed models
- Bandwidth choice for nonparametric classification
- Boosting in Cox regression: a comparison between the likelihood-based and the model-based approaches with focus on the R-packages \textit{CoxBoost} and \textit{mboost}
- A conversation with Leo Breiman.
- Classification by evolutionary ensembles
- Boosting with early stopping: convergence and consistency
- Diversification for better classification trees
- Bias-corrected random forests in regression
- Accelerated gradient boosting
- Out-of-bag estimation of the optimal sample size in bagging
- Machine learning feature analysis illuminates disparity between E3SM climate models and observed climate change
- Assessing the stability of classification trees using Florida birth data
- Using LogitBoost classifier to predict protein structural classes
This page was built for publication: Arcing classifiers. (With discussion)
MaRDI item: Q1807115