Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
DOI: 10.1214/AOS/1016218223 · zbMATH Open: 1106.62323 · OpenAlex: W2024046085 · Wikidata: Q93494458 · Scholia: Q93494458 · MaRDI QID: Q1848780 · FDO: Q1848780
Authors: Jerome H. Friedman, Trevor Hastie, Robert Tibshirani
Publication date: 14 November 2002
Published in: The Annals of Statistics
Full work available at URL: https://projecteuclid.org/euclid.aos/1016218223
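For context, the paper's central observation is that AdaBoost fits an additive model F(x) = Σ_m c_m f_m(x) by stagewise minimization of the exponential loss E[exp(−yF(x))], so that F estimates half the log-odds of the class probabilities. A minimal numpy sketch of discrete AdaBoost with decision stumps, illustrating that stagewise fit (the toy data and all function names here are illustrative, not taken from the paper):

```python
import numpy as np

def fit_stump(X, y, w):
    """Find the decision stump (feature, threshold, sign) with lowest weighted error."""
    best = (np.inf, 0, 0.0, 1)  # (error, feature index, threshold, sign)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for s in (1, -1):
                pred = np.where(X[:, j] <= t, s, -s)
                err = w[pred != y].sum()
                if err < best[0]:
                    best = (err, j, t, s)
    return best

def adaboost(X, y, n_rounds=20):
    """Discrete AdaBoost for y in {-1, +1}; returns the additive score F(x)."""
    n = len(y)
    w = np.full(n, 1.0 / n)   # observation weights
    F = np.zeros(n)           # additive model, built stagewise
    for _ in range(n_rounds):
        err, j, t, s = fit_stump(X, y, w)
        err = max(err, 1e-12)
        c = 0.5 * np.log((1 - err) / err)   # stagewise coefficient
        pred = np.where(X[:, j] <= t, s, -s)
        F += c * pred
        w *= np.exp(-c * y * pred)          # exponential-loss reweighting
        w /= w.sum()
    return F

# Toy 1-D example: positives above 0, negatives below (linearly separable).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1))
y = np.where(X[:, 0] > 0, 1, -1)
F = adaboost(X, y)
print(np.mean(np.sign(F) == y))  # training accuracy; 1.0 on this separable toy
```

The exponential reweighting step is what the paper reinterprets statistically: each round is an approximate Newton step toward the additive logistic model, and replacing it with exact Newton updates on the binomial log-likelihood yields the authors' LogitBoost algorithm.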
Recommendations
- Boosting in structured additive models.
- Rejoinder: Boosting algorithms: regularization, prediction and model fitting
- Generalized Additive Modeling with Implicit Variable Selection by Likelihood‐Based Boosting
- Additive prediction and boosting for functional data
- Comment on: Boosting algorithms: regularization, prediction and model fitting
- Mean and quantile boosting for partially linear additive models
Mathematics Subject Classification:
- Nonparametric regression and quantile regression (62G08)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Learning and adaptive systems in artificial intelligence (68T05)
Cites Work
- A decision-theoretic generalization of on-line learning and an application to boosting
- Title not available
- Title not available
- Title not available
- Bagging predictors
- Multivariate adaptive regression splines
- Bayesian backfitting. (With comments and a rejoinder).
- A Bayesian CART algorithm
- Robust Estimation of a Location Parameter
- Nearest neighbor pattern classification
- On bagging and nonlinear estimation
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Bayesian curve-fitting with free-knot splines
- Flexible Discriminant Analysis by Optimal Scoring
- Matching pursuits with time-frequency dictionaries
- An adaptive version of the boost by majority algorithm
- Arcing classifiers. (With discussion)
- Very simple classification rules perform well on most commonly used datasets
- Improved boosting algorithms using confidence-rated predictions
- Soft margins for AdaBoost
- Title not available
- Linear smoothers and additive models
- Classification by pairwise coupling
- Boosting a weak learning algorithm by majority
- A theory of the learnable
- Using iterated bagging to debias regressions
- Title not available
- Boosting first-order learning
- Variance reduction trends on `boosted' classifiers
Cited In (only showing first 100 items)
- Least angle regression. (With discussion)
- Robust Boosting Algorithm Against Mislabeling in Multiclass Problems
- A conversation with Jerry Friedman
- Semiparametric regression during 2003--2007
- Improving nonparametric regression methods by bagging and boosting.
- Does modeling lead to more accurate classification? A study of relative efficiency in linear classification
- Additive prediction and boosting for functional data
- Boosting in the presence of outliers: adaptive classification with nonconvex loss functions
- New aspects of Bregman divergence in regression and classification with parametric and nonparametric estimation
- Blasso for object categorization and retrieval: towards interpretable visual models
- Information Geometry of U-Boost and Bregman Divergence
- Robust Variable Selection With Exponential Squared Loss
- The boosting approach to machine learning: an overview
- A geometric approach to leveraging weak learners
- On the Bayes-risk consistency of regularized boosting methods.
- Modeling threshold interaction effects through the logistic classification trunk
- Boosting iterative stochastic ensemble method for nonlinear calibration of subsurface flow models
- Three categories customer churn prediction based on the adjusted real AdaBoost
- Iterative bias reduction: a comparative study
- Transformation boosting machines
- Learning ELM-tree from big data based on uncertainty reduction
- Statistical behavior and consistency of classification methods based on convex risk minimization.
- Quadratic boosting
- A dynamic model of expected bond returns: A functional gradient descent approach
- Boosting and instability for regression trees
- Improved boosting algorithms using confidence-rated predictions
- Robustifying AdaBoost by Adding the Naive Error Rate
- Experiments with AdaBoost.RT, an Improved Boosting Scheme for Regression
- Boosting for high-dimensional linear models
- Multicategory large margin classification methods: hinge losses vs. coherence functions
- A likelihood-based boosting algorithm for factor analysis models with binary data
- Bandwidth choice for nonparametric classification
- Boosting with early stopping: convergence and consistency
- Robust estimation for the varying coefficient partially nonlinear models
- Accelerated gradient boosting
- Finding causative genes from high-dimensional data: an appraisal of statistical and machine learning approaches
- Entropy and divergence associated with power function and the statistical application
- Title not available
- Boosting in structured additive models.
- An asymmetric adaptive classification method
- Comment on: Boosting algorithms: regularization, prediction and model fitting
- An empirical comparison of classification algorithms for mortgage default prediction: evidence from a distressed mortgage market
- Variable Selection and Model Choice in Geoadditive Regression Models
- Model-based boosting in R: a hands-on tutorial using the R package mboost
- Cost-sensitive boosting algorithms: do we really need them?
- Small area estimation of the homeless in Los Angeles: an application of cost-sensitive stochastic gradient boosting
- Variable selection by ensembles for the Cox model
- Nearly unbiased variable selection under minimax concave penalty
- Greedy function approximation: A gradient boosting machine.
- A new hybrid classification algorithm for customer churn prediction based on logistic regression and decision trees
- Tree-structured modelling of categorical predictors in generalized additive regression
- Ensemble classification based on generalized additive models
- Quadratic Majorization for Nonconvex Loss with Applications to the Boosting Algorithm
- Robust boosting with truncated loss functions
- Improved nearest neighbor classifiers by weighting and selection of predictors
- Mean and quantile boosting for partially linear additive models
- On hybrid classification using model assisted posterior estimates
- Taxonomy for characterizing ensemble methods in classification tasks: a review and annotated bibliography
- Forecasting with many predictors: is boosting a viable alternative?
- Kullback-Leibler aggregation and misspecified generalized linear models
- Cost-sensitive boosting for classification of imbalanced data
- Cost-sensitive learning and decision making revisited
- Boosting algorithms: regularization, prediction and model fitting
- Simultaneous adaptation to the margin and to complexity in classification
- Random classification noise defeats all convex potential boosters
- Randomized Gradient Boosting Machine
- Nonparametric multiple expectile regression via ER-Boost
- A stochastic approximation view of boosting
- Generalized Additive Modeling with Implicit Variable Selection by Likelihood‐Based Boosting
- Estimating the dimension of a model
- Node harvest
- Stochastic gradient boosting.
- A combination selection algorithm on forecasting
- A boosting method for maximization of the area under the ROC curve
- Optimization by Gradient Boosting
- Regression trees for predicting mortality in patients with cardiovascular disease: what improvement is achieved by using ensemble-based methods?
- Optimal prediction pools
- Theory of Classification: a Survey of Some Recent Advances
- Logistic model trees
- Boosting additive models using component-wise P-splines
- Statistical modeling: The two cultures. (With comments and a rejoinder).
- Regularized Estimation in the Accelerated Failure Time Model with High-Dimensional Covariates
- A generic path algorithm for regularized statistical estimation
- Non-crossing large-margin probability estimation and its application to robust SVM via preconditioning
- Beyond sequential covering -- boosted decision rules
- Remembering Leo Breiman
- Remembrance of Leo Breiman
- A simple extension of boosting for asymmetric mislabeled data
- A unified classification model based on robust optimization
- New multicategory boosting algorithms based on multicategory Fisher-consistent losses
- Aggregating classifiers with ordinal response structure
- Boosting in Cox regression: a comparison between the likelihood-based and the model-based approaches with focus on the R-packages \textit{CoxBoost} and \textit{mboost}
- Title not available
- Further results on the margin explanation of boosting: new algorithm and experiments
- Obtaining linguistic fuzzy rule-based regression models from imprecise data with multiobjective genetic algorithms
- Detection of differential item functioning in Rasch models by boosting techniques
- Rejoinder: Boosting algorithms: regularization, prediction and model fitting
- A noise-detection based AdaBoost algorithm for mislabeled data
- Functional gradient ascent for probit regression
- Density estimation with stagewise optimization of the empirical risk
Uses Software