Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
DOI: 10.1214/AOS/1016218223 · zbMATH Open: 1106.62323 · OpenAlex: W2024046085 · Wikidata: Q93494458 (Scholia) · MaRDI QID: Q1848780 (FDO)
Authors: Jerome H. Friedman, Trevor Hastie, Robert Tibshirani
Publication date: 14 November 2002
Published in: The Annals of Statistics
Full work available at URL: https://projecteuclid.org/euclid.aos/1016218223
Recommendations
- Boosting in structured additive models.
- Rejoinder: Boosting algorithms: regularization, prediction and model fitting
- Generalized Additive Modeling with Implicit Variable Selection by Likelihood‐Based Boosting
- Additive prediction and boosting for functional data
- Comment on: Boosting algorithms: regularization, prediction and model fitting
- Mean and quantile boosting for partially linear additive models
MSC classification:
- Nonparametric regression and quantile regression (62G08)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Learning and adaptive systems in artificial intelligence (68T05)
Cites Work
- A decision-theoretic generalization of on-line learning and an application to boosting
- Bagging predictors
- Multivariate adaptive regression splines
- Bayesian backfitting. (With comments and a rejoinder).
- A Bayesian CART algorithm
- Robust Estimation of a Location Parameter
- Nearest neighbor pattern classification
- On bagging and nonlinear estimation
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Bayesian curve-fitting with free-knot splines
- Flexible Discriminant Analysis by Optimal Scoring
- Matching pursuits with time-frequency dictionaries
- An adaptive version of the boost by majority algorithm
- Arcing classifiers. (With discussion)
- Very simple classification rules perform well on most commonly used datasets
- Improved boosting algorithms using confidence-rated predictions
- Soft margins for AdaBoost
- Linear smoothers and additive models
- Classification by pairwise coupling
- Boosting a weak learning algorithm by majority
- A theory of the learnable
- Using iterated bagging to debias regressions
- Boosting first-order learning
- Variance reduction trends on 'boosted' classifiers
Cited In (first 100 items)
- A generic path algorithm for regularized statistical estimation
- Non-crossing large-margin probability estimation and its application to robust SVM via preconditioning
- Beyond sequential covering -- boosted decision rules
- Remembering Leo Breiman
- Remembrance of Leo Breiman
- A simple extension of boosting for asymmetric mislabeled data
- A unified classification model based on robust optimization
- New multicategory boosting algorithms based on multicategory Fisher-consistent losses
- Aggregating classifiers with ordinal response structure
- Boosting in Cox regression: a comparison between the likelihood-based and the model-based approaches with focus on the R packages CoxBoost and mboost
- Further results on the margin explanation of boosting: new algorithm and experiments
- Obtaining linguistic fuzzy rule-based regression models from imprecise data with multiobjective genetic algorithms
- Detection of differential item functioning in Rasch models by boosting techniques
- Rejoinder: Boosting algorithms: regularization, prediction and model fitting
- A noise-detection based AdaBoost algorithm for mislabeled data
- Functional gradient ascent for probit regression
- Density estimation with stagewise optimization of the empirical risk
- Boosting nonlinear additive autoregressive time series
- Boosted Bayesian network classifiers
- Boosting ridge regression
- Robust exponential squared loss-based variable selection for high-dimensional single-index varying-coefficient model
- \(L_{2}\) boosting in kernel regression
- Improved customer choice predictions using ensemble methods
- Fully corrective boosting with arbitrary loss and regularization
- Stochastic boosting algorithms
- Stochastic boosting algorithms
- On a method for constructing ensembles of regression models
- Looking for lumps: boosting and bagging for density estimation.
- Boosting in the presence of noise
- A weight-adjusted voting algorithm for ensembles of classifiers
- Soft-max boosting
- An extensive comparison of recent classification tools applied to microarray data
- Exact bootstrap \(k\)-nearest neighbor learners
- On boosting kernel regression
- Sparse regression and support recovery with \(\mathbb{L}_2\)-boosting algorithms
- A local boosting algorithm for solving classification problems
- Soft memberships for spectral clustering, with application to permeable language distinction
- Multinomial logit models with implicit variable selection
- Designing a boosted classifier on Riemannian manifolds
- Population theory for boosting ensembles.
- A sharp nonasymptotic bound and phase diagram of \(L_{1/2}\) regularization
- Process consistency for AdaBoost.
- Boosted classification trees and class probability/quantile estimation
- Survival ensembles by the sum of pairwise differences with application to lung cancer microarray studies
- Regularized Bayesian quantile regression
- A boosting approach for supervised Mahalanobis distance metric learning
- Representing and recognizing objects with massive local image patches
- Recursive aggregation of estimators by the mirror descent algorithm with averaging
- Robust Loss Functions for Boosting
- Deformation of log-likelihood loss function for multiclass boosting
- Cox process functional learning
- Supervised projection approach for boosting classifiers
- Analysis of boosting algorithms using the smooth margin function
- Optimal rates of aggregation in classification under low noise assumption
- Unsupervised weight parameter estimation method for ensemble learning
- Boosting conditional probability estimators
- Bootstrap -- an exploration
- Robust MAVE for single-index varying-coefficient models
- Boosting method for nonlinear transformation models with censored survival data
- Counting and enumerating aggregate classifiers
- Robust variable selection with exponential squared loss for the spatial autoregressive model
- Cost-sensitive ensemble learning: a unifying framework
- On the accuracy of cross-validation in the classification problem
- Parallelizing AdaBoost by weights dynamics
- Logitboost with errors-in-variables
- A cascade of boosted generative and discriminative classifiers for vehicle detection
- Representation in the (artificial) immune system
- Two-step sparse boosting for high-dimensional longitudinal data with varying coefficients
- An update on statistical boosting in biomedicine
- Navigating random forests and related advances in algorithmic modeling
- General sparse boosting: improving feature selection of \(L_{2}\) boosting by correlation-based penalty family
- Boosting GARCH and neural networks for the prediction of heteroskedastic time series
- Logistic regression using covariates obtained by product-unit neural network models
- Tutorial series on brain-inspired computing. VI: Geometrical structure of boosting algorithm
- Boosting with Noisy Data: Some Views from Statistical Theory
- Covariate balancing propensity score by tailored loss functions
- Vote counting measures for ensemble classifiers.
- Functional dissipation microarrays for classification
- A concrete statistical realization of Kleinberg's stochastic discrimination for pattern recognition. I: Two-class classification
- Probing for sparse and fast variable selection with model-based boosting
- Sample size determination for logistic regression
- Using LogitBoost classifier to predict protein structural classes
- Boosted Regression Trees with Errors in Variables
- Sketching information divergences
- Surrogate maximization/minimization algorithms and extensions
- Logitboost autoregressive networks
- A fast genetic method for inducting descriptive fuzzy models.
- Multi-class boosting with asymmetric binary weak-learners
- Self-improved gaps almost everywhere for the agnostic approximation of monomials
- A Fisher consistent multiclass loss function with variable margin on positive examples
- The synergy between PAV and AdaBoost
- Gradient boosting for high-dimensional prediction of rare events
- Noise peeling methods to improve boosting algorithms
- Adaptive index models for marker-based risk stratification
- A \(\mathbb R\)eal generalization of discrete AdaBoost
- Complexity in the case against accuracy estimation
- Using boosting to prune double-bagging ensembles
Uses Software