Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
DOI: 10.1214/AOS/1016218223 · zbMATH Open: 1106.62323 · OpenAlex: W2024046085 · Wikidata: Q93494458 (Scholia) · MaRDI QID: Q1848780
Authors: Jerome H. Friedman, Trevor Hastie, Robert Tibshirani
Publication date: 14 November 2002
Published in: The Annals of Statistics
Full work available at URL: https://projecteuclid.org/euclid.aos/1016218223
Recommendations
- Boosting in structured additive models.
- Rejoinder: Boosting algorithms: regularization, prediction and model fitting
- Generalized Additive Modeling with Implicit Variable Selection by Likelihood‐Based Boosting
- Additive prediction and boosting for functional data
- Comment on: Boosting algorithms: regularization, prediction and model fitting
- Mean and quantile boosting for partially linear additive models
Mathematics Subject Classification
- Nonparametric regression and quantile regression (62G08)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Learning and adaptive systems in artificial intelligence (68T05)
Cites Work
- A decision-theoretic generalization of on-line learning and an application to boosting
- Title not available
- Title not available
- Title not available
- Bagging predictors
- Multivariate adaptive regression splines
- Bayesian backfitting. (With comments and a rejoinder).
- A Bayesian CART algorithm
- Robust Estimation of a Location Parameter
- Nearest neighbor pattern classification
- On bagging and nonlinear estimation
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Bayesian curve-fitting with free-knot splines
- Flexible Discriminant Analysis by Optimal Scoring
- Matching pursuits with time-frequency dictionaries
- An adaptive version of the boost by majority algorithm
- Arcing classifiers. (With discussion)
- Very simple classification rules perform well on most commonly used datasets
- Improved boosting algorithms using confidence-rated predictions
- Soft margins for AdaBoost
- Title not available
- Linear smoothers and additive models
- Classification by pairwise coupling
- Boosting a weak learning algorithm by majority
- A theory of the learnable
- Using iterated bagging to debias regressions
- Title not available
- Boosting first-order learning
- Variance reduction trends on `boosted' classifiers
Cited In (first 100 items)
- A generic path algorithm for regularized statistical estimation
- Non-crossing large-margin probability estimation and its application to robust SVM via preconditioning
- Beyond sequential covering -- boosted decision rules
- Remembering Leo Breiman
- Remembrance of Leo Breiman
- A simple extension of boosting for asymmetric mislabeled data
- A unified classification model based on robust optimization
- New multicategory boosting algorithms based on multicategory Fisher-consistent losses
- Aggregating classifiers with ordinal response structure
- Boosting in Cox regression: a comparison between the likelihood-based and the model-based approaches with focus on the R-packages \textit{CoxBoost} and \textit{mboost}
- Title not available
- Further results on the margin explanation of boosting: new algorithm and experiments
- Obtaining linguistic fuzzy rule-based regression models from imprecise data with multiobjective genetic algorithms
- Detection of differential item functioning in Rasch models by boosting techniques
- Rejoinder: Boosting algorithms: regularization, prediction and model fitting
- A noise-detection based AdaBoost algorithm for mislabeled data
- Functional gradient ascent for probit regression
- Density estimation with stagewise optimization of the empirical risk
- Boosting nonlinear additive autoregressive time series
- Title not available
- Title not available
- Boosted Bayesian network classifiers
- Boosting ridge regression
- Robust exponential squared loss-based variable selection for high-dimensional single-index varying-coefficient model
- \(L_{2}\) boosting in kernel regression
- Improved customer choice predictions using ensemble methods
- Fully corrective boosting with arbitrary loss and regularization
- Stochastic boosting algorithms
- On a method for constructing ensembles of regression models
- Looking for lumps: boosting and bagging for density estimation.
- Boosting in the presence of noise
- A weight-adjusted voting algorithm for ensembles of classifiers
- Soft-max boosting
- An extensive comparison of recent classification tools applied to microarray data
- Exact bootstrap \(k\)-nearest neighbor learners
- On boosting kernel regression
- Sparse regression and support recovery with \(\mathbb{L}_2\)-boosting algorithms
- A local boosting algorithm for solving classification problems
- Soft memberships for spectral clustering, with application to permeable language distinction
- Multinomial logit models with implicit variable selection
- Designing a boosted classifier on Riemannian manifolds
- Population theory for boosting ensembles.
- A sharp nonasymptotic bound and phase diagram of \(L_{1/2}\) regularization
- Process consistency for AdaBoost.
- Boosted classification trees and class probability/quantile estimation
- Survival ensembles by the sum of pairwise differences with application to lung cancer microarray studies
- Regularized Bayesian quantile regression
- A boosting approach for supervised Mahalanobis distance metric learning
- Representing and recognizing objects with massive local image patches
- Recursive aggregation of estimators by the mirror descent algorithm with averaging
- Robust Loss Functions for Boosting
- Deformation of log-likelihood loss function for multiclass boosting
- Cox process functional learning
- Supervised projection approach for boosting classifiers
- Analysis of boosting algorithms using the smooth margin function
- Optimal rates of aggregation in classification under low noise assumption
- Unsupervised weight parameter estimation method for ensemble learning
- Boosting conditional probability estimators
- Bootstrap -- an exploration
- Robust MAVE for single-index varying-coefficient models
- Boosting method for nonlinear transformation models with censored survival data
- Least angle regression. (With discussion)
- Robust Boosting Algorithm Against Mislabeling in Multiclass Problems
- A conversation with Jerry Friedman
- Semiparametric regression during 2003--2007
- Improving nonparametric regression methods by bagging and boosting.
- Does modeling lead to more accurate classification? A study of relative efficiency in linear classification
- Additive prediction and boosting for functional data
- Boosting in the presence of outliers: adaptive classification with nonconvex loss functions
- New aspects of Bregman divergence in regression and classification with parametric and nonparametric estimation
- Blasso for object categorization and retrieval: towards interpretable visual models
- Information Geometry of U-Boost and Bregman Divergence
- Robust Variable Selection With Exponential Squared Loss
- The boosting approach to machine learning: an overview
- A geometric approach to leveraging weak learners
- On the Bayes-risk consistency of regularized boosting methods.
- Modeling threshold interaction effects through the logistic classification trunk
- Boosting iterative stochastic ensemble method for nonlinear calibration of subsurface flow models
- Three categories customer churn prediction based on the adjusted real AdaBoost
- Iterative bias reduction: a comparative study
- Transformation boosting machines
- Learning ELM-tree from big data based on uncertainty reduction
- Statistical behavior and consistency of classification methods based on convex risk minimization.
- Quadratic boosting
- A dynamic model of expected bond returns: A functional gradient descent approach
- Boosting and instability for regression trees
- Improved boosting algorithms using confidence-rated predictions
- Robustifying AdaBoost by Adding the Naive Error Rate
- Experiments with AdaBoost.RT, an Improved Boosting Scheme for Regression
- Boosting for high-dimensional linear models
- Multicategory large margin classification methods: hinge losses vs. coherence functions
- A likelihood-based boosting algorithm for factor analysis models with binary data
- Bandwidth choice for nonparametric classification
- Boosting with early stopping: convergence and consistency
- Robust estimation for the varying coefficient partially nonlinear models
- Accelerated gradient boosting
- Finding causative genes from high-dimensional data: an appraisal of statistical and machine learning approaches
- Entropy and divergence associated with power function and the statistical application
- Title not available