Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)

From MaRDI portal

Publication:1848780

DOI: 10.1214/AOS/1016218223
zbMATH: 1106.62323
OpenAlex: W2024046085
Wikidata: Q93494458
Scholia: Q93494458
MaRDI QID: Q1848780

Jerome H. Friedman, Trevor Hastie, Robert Tibshirani

Publication date: 14 November 2002

Published in: The Annals of Statistics

Full work available at URL: https://projecteuclid.org/euclid.aos/1016218223




Related Items (only showing the first 100 items)

Two-step sparse boosting for high-dimensional longitudinal data with varying coefficients
Learning ELM-tree from big data based on uncertainty reduction
Robust variable selection with exponential squared loss for the spatial autoregressive model
Cost-sensitive ensemble learning: a unifying framework
On the accuracy of cross-validation in the classification problem
Representation in the (artificial) immune system
Finding causative genes from high-dimensional data: an appraisal of statistical and machine learning approaches
Some relationships between fuzzy and random set-based classifiers and models
An empirical comparison of classification algorithms for mortgage default prediction: evidence from a distressed mortgage market
Logistic regression using covariates obtained by product-unit neural network models
Cost-sensitive boosting algorithms: do we really need them?
Tutorial series on brain-inspired computing. VI: Geometrical structure of boosting algorithm
Mean and quantile boosting for partially linear additive models
Improved nearest neighbor classifiers by weighting and selection of predictors
Logitboost autoregressive networks
Gradient boosting for high-dimensional prediction of rare events
Improved customer choice predictions using ensemble methods
\(L_{2}\) boosting in kernel regression
Self-improved gaps almost everywhere for the agnostic approximation of monomials
A Fisher consistent multiclass loss function with variable margin on positive examples
Small area estimation of the homeless in Los Angeles: an application of cost-sensitive stochastic gradient boosting
On hybrid classification using model assisted posterior estimates
Blasso for object categorization and retrieval: towards interpretable visual models
Complexity in the case against accuracy estimation
A simple extension of boosting for asymmetric mislabeled data
Further results on the margin explanation of boosting: new algorithm and experiments
Survival ensembles by the sum of pairwise differences with application to lung cancer microarray studies
Kullback-Leibler aggregation and misspecified generalized linear models
Boosting algorithms: regularization, prediction and model fitting
Comment on: Boosting algorithms: regularization, prediction and model fitting
Rejoinder: Boosting algorithms: regularization, prediction and model fitting
A boosting method for maximization of the area under the ROC curve
Representing and recognizing objects with massive local image patches
A boosting approach for supervised Mahalanobis distance metric learning
Functional gradient ascent for probit regression
A noise-detection based AdaBoost algorithm for mislabeled data
Fully corrective boosting with arbitrary loss and regularization
Vote counting measures for ensemble classifiers.
On a method for constructing ensembles of regression models
Entropy and divergence associated with power function and the statistical application
Sparse regression and support recovery with \(\mathbb{L}_2\)-boosting algorithms
Boosting multi-features with prior knowledge for mini unmanned helicopter landmark detection
A fast genetic method for inducting descriptive fuzzy models.
Statistical modeling: The two cultures. (With comments and a rejoinder).
A concrete statistical realization of Kleinberg's stochastic discrimination for pattern recognition. I: Two-class classification
Does modeling lead to more accurate classification? A study of relative efficiency in linear classification
A sharp nonasymptotic bound and phase diagram of \(L_{1/2}\) regularization
Unsupervised weight parameter estimation method for ensemble learning
Cox process functional learning
Boosting conditional probability estimators
Counting and enumerating aggregate classifiers
Soft memberships for spectral clustering, with application to permeable language distinction
An extensive comparison of recent classification tools applied to microarray data
Boosting and instability for regression trees
Boosting additive models using component-wise P-splines
Using boosting to prune double-bagging ensembles
Additive prediction and boosting for functional data
Boosting nonlinear additive autoregressive time series
Taxonomy for characterizing ensemble methods in classification tasks: a review and annotated bibliography
Greedy function approximation: A gradient boosting machine.
Non-crossing large-margin probability estimation and its application to robust SVM via preconditioning
A cascade of boosted generative and discriminative classifiers for vehicle detection
Model-based boosting in R: a hands-on tutorial using the R package mboost
Remembering Leo Breiman
Remembrance of Leo Breiman
Node harvest
Quadratic Majorization for Nonconvex Loss with Applications to the Boosting Algorithm
Robust boosting with truncated loss functions
Navigating random forests and related advances in algorithmic modeling
Boosting GARCH and neural networks for the prediction of heteroskedastic time series
Nearly unbiased variable selection under minimax concave penalty
Cost-sensitive boosting for classification of imbalanced data
Functional dissipation microarrays for classification
Estimating the dimension of a model
New multicategory boosting algorithms based on multicategory Fisher-consistent losses
Tree-structured modelling of categorical predictors in generalized additive regression
A new hybrid classification algorithm for customer churn prediction based on logistic regression and decision trees
Boosting in Cox regression: a comparison between the likelihood-based and the model-based approaches with focus on the R-packages \textit{CoxBoost} and \textit{mboost}
Optimal prediction pools
Robust exponential squared loss-based variable selection for high-dimensional single-index varying-coefficient model
Obtaining linguistic fuzzy rule-based regression models from imprecise data with multiobjective genetic algorithms
Ensemble classification based on generalized additive models
Sketching information divergences
Boosted Bayesian network classifiers
Exact bootstrap \(k\)-nearest neighbor learners
Surrogate maximization/minimization algorithms and extensions
A dynamic model of expected bond returns: A functional gradient descent approach
A weight-adjusted voting algorithm for ensembles of classifiers
Iterative bias reduction: a comparative study
Soft-max boosting
Parallelizing AdaBoost by weights dynamics
Boosting ridge regression
A stochastic approximation view of boosting
A local boosting algorithm for solving classification problems
Logitboost with errors-in-variables
A \(\mathbb R\)eal generalization of discrete AdaBoost
On boosting kernel regression
Embedding ensemble tracking in a stochastic framework for robust object tracking
A geometric approach to leveraging weak learners
A conversation with Jerry Friedman


Uses Software



Cites Work




This page was built for publication: Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)