Arcing classifiers. (With discussion)
Cites work
- scientific article; zbMATH DE number 3860199 (no title available)
- scientific article; zbMATH DE number 1149409 (no title available)
- scientific article; zbMATH DE number 1149421 (no title available)
- scientific article; zbMATH DE number 784362 (no title available)
- scientific article; zbMATH DE number 823069 (no title available)
- A Recursive Partitioning Decision Rule for Nonparametric Classification
- A conversation with Leo Breiman.
- A decision-theoretic generalization of on-line learning and an application to boosting
- A theory of the learnable
- Bagging predictors
- Boosting a weak learning algorithm by majority
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Cryptographic limitations on learning Boolean formulae and finite automata
- Estimation of dependences based on empirical data. Transl. from the Russian by Samuel Kotz
- Heuristics of instability and stabilization in model selection
Cited in (first 100 items shown)
- Ensemble Subset Regression (ENSURE): Efficient High-dimensional Prediction
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Stochastic boosting algorithms
- Model-based boosting in R: a hands-on tutorial using the R package mboost
- Accelerated gradient boosting
- Improving nonparametric regression methods by bagging and boosting.
- Boosting algorithms: regularization, prediction and model fitting
- A nonlinear aggregation type classifier
- An extensive comparison of recent classification tools applied to microarray data
- Bundling classifiers by bagging trees
- A bootstrap-based aggregate classifier for model-based clustering
- Bias-corrected random forests in regression
- Aggregating classifiers with ordinal response structure
- Variable selection for generalized linear mixed models by \(L_1\)-penalized estimation
- Boosting additive models using component-wise P-splines
- Pruning of error correcting output codes by optimization of accuracy-diversity trade off
- Risk bounds for CART classifiers under a margin condition
- Boosting with early stopping: convergence and consistency
- A conversation with Leo Breiman.
- Nonparametric multiple expectile regression via ER-Boost
- Accurate tree-based missing data imputation and data fusion within the statistical learning paradigm
- Complexities of convex combinations and bounding the generalization error in classification
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- A conversation with Larry Brown
- Double-bagging: Combining classifiers by bootstrap aggregation
- Bandwidth choice for nonparametric classification
- Remembering Leo Breiman
- Remembrance of Leo Breiman
- Taxonomy for characterizing ensemble methods in classification tasks: a review and annotated bibliography
- Learning model trees from evolving data streams
- Hellinger distance decision trees are robust and skew-insensitive
- A local boosting algorithm for solving classification problems
- Statistical behavior and consistency of classification methods based on convex risk minimization.
- A weight-adjusted voting algorithm for ensembles of classifiers
- Extremely randomized trees
- On the Bayes-risk consistency of regularized boosting methods.
- Trimmed bagging
- \(L_{2}\) boosting in kernel regression
- Experimental study for the comparison of classifier combination methods
- An empirical comparison of ensemble methods based on classification trees
- Analysis of boosting algorithms using the smooth margin function
- Forecasting China's foreign trade volume with a kernel-based hybrid econometric-AI ensemble learning approach
- Process consistency for AdaBoost.
- A conversation with Jerry Friedman
- Survival ensembles by the sum of pairwise differences with application to lung cancer microarray studies
- Out-of-bag estimation of the optimal sample size in bagging
- Boosting in the presence of outliers: adaptive classification with nonconvex loss functions
- scientific article; zbMATH DE number 1759568 (no title available)
- Boosting for high-dimensional linear models
- Comment on: Boosting algorithms: regularization, prediction and model fitting
- Statistical modeling: The two cultures. (With comments and a rejoinder).
- A note on margin-based loss functions in classification
- Population theory for boosting ensembles.
- New aspects of Bregman divergence in regression and classification with parametric and nonparametric estimation
- Semiparametric regression during 2003--2007
- Comparing boosting and bagging for decision trees of rankings
- Using boosting to prune double-bagging ensembles
- Multivariate data analysis and modeling through classification and regression trees
- Optimization by Gradient Boosting
- Boosting in Cox regression: a comparison between the likelihood-based and the model-based approaches with focus on the R-packages \textit{CoxBoost} and \textit{mboost}
- Quadratic boosting
- Analyzing bagging
- A statistical approach to growing a reliable honest tree.
- Quantum adiabatic machine learning
- Boosting conditional probability estimators
- Improved customer choice predictions using ensemble methods
- Theory of Classification: a Survey of Some Recent Advances
- Density estimation with stagewise optimization of the empirical risk
- A cooperative constructive method for neural networks for pattern recognition
- A novel sparse least squares support vector machines
- Boosting and instability for regression trees
- Adjusting the outputs of a classifier to new a priori probabilities: A simple procedure
- Interpreting neural-network results: a simulation study.
- Statistical uncertainty estimation using random forests and its application to drought forecast
- Time series forecasting with multiple candidate models: selecting or combining?
- Noisy replication in skewed binary classification.
- Canonical forest
- Generalised indirect classifiers
- AdaBoost and robust one-bit compressed sensing
- Attractor networks for shape recognition
- Objective model selection with parallel genetic algorithms using an eradication strategy
- Different Paradigms for Choosing Sequential Reweighting Algorithms
- Bagging tree classifiers for glaucoma diagnosis
- Quantum AdaBoost algorithm via cluster state
- Bounding the generalization error of convex combinations of classifiers: Balancing the dimensionality and the margins.
- Prediction of wind loading on masked angle members in lattice tower structures
- Classification by evolutionary ensembles
- Classification by ensembles from random partitions of high-dimensional data
- General sparse boosting: improving feature selection of \(L_{2}\) boosting by correlation-based penalty family
- Insurance Premium Prediction via Gradient Tree-Boosted Tweedie Compound Poisson Models
- A distributed algorithm for high-dimension convex quadratically constrained quadratic programs
- Tweedie gradient boosting for extremely unbalanced zero-inflated data
- Machine learning feature analysis illuminates disparity between E3SM climate models and observed climate change
- Boosting as a kernel-based method
- scientific article; zbMATH DE number 7307480 (no title available)
- Sparse projection oblique randomer forests
- Gradient boosting for linear mixed models
- Modelling additive extremile regression by iteratively penalized least asymmetric weighted squares and gradient descent boosting