Boosting a weak learning algorithm by majority
From MaRDI portal
Publication: 1899915
DOI: 10.1006/INCO.1995.1136
zbMath: 0833.68109
OpenAlex: W2070534370
MaRDI QID: Q1899915
Publication date: 10 October 1995
Published in: Information and Computation
Full work available at URL: https://doi.org/10.1006/inco.1995.1136
Related Items (showing first 100 items)
Fully corrective gradient boosting with squared hinge: fast learning rates and early stopping
A boosting first-hitting-time model for survival analysis in high-dimensional settings
Downscaling shallow water simulations using artificial neural networks and boosted trees
Unbiased Boosting Estimation for Censored Survival Data
AdaBoost Semiparametric Model Averaging Prediction for Multiple Categories
Active classification using belief functions and information gain maximization
Conditional sparse boosting for high-dimensional instrumental variable estimation
A comparative study of machine learning models for predicting the state of reactive mixing
On the Bayes-risk consistency of regularized boosting methods
Learning DNF in time \(2^{\widetilde O(n^{1/3})}\)
Consensus analysis of multiple classifiers using non-repetitive variables: diagnostic application to microarray gene expression data
Finding causative genes from high-dimensional data: an appraisal of statistical and machine learning approaches
Toward Efficient Ensemble Learning with Structure Constraints: Convergent Algorithms and Applications
On PAC learning algorithms for rich Boolean function classes
An overtraining-resistant stochastic modeling method for pattern recognition
A precise high-dimensional asymptotic theory for boosting and minimum-\(\ell_1\)-norm interpolated classifiers
An analysis of diversity measures
A general dimension for query learning
Boosting with Noisy Data: Some Views from Statistical Theory
Noise peeling methods to improve boosting algorithms
A decision-theoretic generalization of on-line learning and an application to boosting
Argumentation based reinforcement learning for meta-knowledge extraction
Ten More Years of Error Rate Research
A survey of deep network techniques all classifiers can adopt
Mixed-Integer Convex Nonlinear Optimization with Gradient-Boosted Trees Embedded
\(L_{2}\) boosting in kernel regression
Partial Occam's Razor and its applications
Probability estimation for multi-class classification using adaboost
An efficient membership-query algorithm for learning DNF with respect to the uniform distribution
Committee polyhedral separability: complexity and polynomial approximation
Deep learning of support vector machines with class probability output networks
PAC-Bayesian risk bounds for group-analysis sparse regression by exponential weighting
Deterministic Neural Classification
Preference disaggregation and statistical learning for multicriteria decision support: A review
A time-series modeling method based on the boosting gradient-descent theory
Accelerated gradient boosting
Sharp oracle inequalities for aggregation of affine estimators
Aggregation of estimators and stochastic optimization
Boosting local quasi-likelihood estimators
Teaching and Compressing for Low VC-Dimension
Recursive aggregation of estimators by the mirror descent algorithm with averaging
Semiparametric regression during 2003--2007
Complexity of hard-core set proofs
Random classification noise defeats all convex potential boosters
Encoding Through Patterns: Regression Tree–Based Neuronal Population Models
The monotone theory for the PAC-model
Scheme of boosting in the problems of combinatorial optimization induced by the collective training algorithms
Bayesian model averaging: A tutorial. (with comments and a rejoinder)
Locating Infinite Discontinuities in Computer Experiments
New degree bounds for polynomial threshold functions
Logical analysis of data as a tool for the analysis of probabilistic discrete choice behavior
Boosting method for nonlinear transformation models with censored survival data
Cox process functional learning
Learning unions of \(\omega(1)\)-dimensional rectangles
Multi-vehicle detection algorithm through combining Harr and HOG features
Maximum patterns in datasets
An efficient modified boosting method for solving classification problems
Model combination for credit risk assessment: a stacked generalization approach
GA-Ensemble: a genetic algorithm for robust ensembles
Remembering Leo Breiman
Algorithms for manipulation of level sets of nonparametric density estimates
FE-CIDIM: fast ensemble of CIDIM classifiers
A note on margin-based loss functions in classification
Concept lattice based composite classifiers for high predictability
Aggregating classifiers with ordinal response structure
Learning with continuous experts using drifting games
On XOR lemmas for the weight of polynomial threshold functions
Bayesian partition modelling
BoostWofE: a new sequential weights of evidence model reducing the effect of conditional dependency
Delta Boosting Machine with Application to General Insurance
Seeing Inside the Black Box: Using Diffusion Index Methodology to Construct Factor Proxies in Large Scale Macroeconomic Time Series Environments
EROS: Ensemble rough subspaces
Multicategory large margin classification methods: hinge losses vs. coherence functions
On the fusion of threshold classifiers for categorization and dimensionality reduction
SEMIPARAMETRIC REGRESSION AND GRAPHICAL MODELS
Boosting in Cox regression: a comparison between the likelihood-based and the model-based approaches with focus on the R-packages \textit{CoxBoost} and \textit{mboost}
Theory of Classification: a Survey of Some Recent Advances
Boosting in the Presence of Outliers: Adaptive Classification With Nonconvex Loss Functions
Data gravitation based classification
Arcing classifiers. (With discussion)
Boosting the margin: a new explanation for the effectiveness of voting methods
Soft-max boosting
Boosted sparse nonlinear distance metric learning
Subject-specific Bradley–Terry–Luce models with implicit variable selection
SVM-boosting based on Markov resampling: theory and algorithm
A Fast Learning Algorithm for Deep Belief Nets
Classification tree analysis using TARGET
Hybrid classification algorithms based on boosting and support vector machines
Boosting in the presence of noise
Sampling from non-smooth distributions through Langevin diffusion
On boosting kernel regression
Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
On weak base hypotheses and their implications for boosting regression and classification
On the boosting ability of top-down decision tree learning algorithms
A new accelerated proximal boosting machine with convergence rate \(O(1/t^2)\)
On the perceptron's compression
A geometric approach to leveraging weak learners
Boosting using branching programs
Drifting games and Brownian motion