Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
DOI: 10.1214/AOS/1016218223
zbMATH Open: 1106.62323
OpenAlex: W2024046085
Wikidata: Q93494458 (Scholia: Q93494458)
MaRDI QID: Q1848780
FDO: Q1848780
Authors: Jerome H. Friedman, Trevor Hastie, Robert Tibshirani
Publication date: 14 November 2002
Published in: The Annals of Statistics
Full work available at URL: https://projecteuclid.org/euclid.aos/1016218223
Recommendations
- Boosting in structured additive models.
- Rejoinder: Boosting algorithms: regularization, prediction and model fitting
- Generalized Additive Modeling with Implicit Variable Selection by Likelihood‐Based Boosting
- Additive prediction and boosting for functional data
- Comment on: Boosting algorithms: regularization, prediction and model fitting
- Mean and quantile boosting for partially linear additive models
Classification (MSC):
- Nonparametric regression and quantile regression (62G08)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Learning and adaptive systems in artificial intelligence (68T05)
Cites Work
- A decision-theoretic generalization of on-line learning and an application to boosting
- Bagging predictors
- Multivariate adaptive regression splines
- Bayesian backfitting. (With comments and a rejoinder).
- A Bayesian CART algorithm
- Robust Estimation of a Location Parameter
- Nearest neighbor pattern classification
- On bagging and nonlinear estimation
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Bayesian curve-fitting with free-knot splines
- Flexible Discriminant Analysis by Optimal Scoring
- Matching pursuits with time-frequency dictionaries
- An adaptive version of the boost by majority algorithm
- Arcing classifiers. (With discussion)
- Very simple classification rules perform well on most commonly used datasets
- Improved boosting algorithms using confidence-rated predictions
- Soft margins for AdaBoost
- Linear smoothers and additive models
- Classification by pairwise coupling
- Boosting a weak learning algorithm by majority
- A theory of the learnable
- Using iterated bagging to debias regressions
- Boosting first-order learning
- Variance reduction trends on `boosted' classifiers
Cited In (showing the first 100 items)
- A new accelerated proximal boosting machine with convergence rate \(O(1/t^2)\)
- Machine learning for corporate default risk: multi-period prediction, frailty correlation, loan portfolios, and tail probabilities
- On the Effect and Remedies of Shrinkage on Classification Probability Estimation
- Stochastic approximation: from statistical origin to big-data, multidisciplinary applications
- Instance-dependent cost-sensitive learning for detecting transfer fraud
- A comparative study of the leading machine learning techniques and two new optimization algorithms
- Statistically validated lead-lag networks and inventory prediction in the foreign exchange market
- Dimension reduction boosting
- Top-down decision tree learning as information based boosting
- SVM-boosting based on Markov resampling: theory and algorithm
- Variance reduction trends on `boosted' classifiers
- Calibrating AdaBoost for phoneme classification
- Local uncertainty sampling for large-scale multiclass logistic regression
- Ensemble of fast learning stochastic gradient boosting
- Some accelerated alternating proximal gradient algorithms for a class of nonconvex nonsmooth problems
- Evolution of high-frequency systematic trading: a performance-driven gradient boosting model
- Big data analytics for seismic fracture identification using amplitude-based statistics
- Subject-specific Bradley–Terry–Luce models with implicit variable selection
- Robust estimation in partially linear regression models
- A fast algorithm for the accelerated failure time model with high-dimensional time-to-event data
- Nonparametric Modeling of Neural Point Processes via Stochastic Gradient Boosting Regression
- Automatic model selection for high-dimensional survival analysis
- Combining biomarkers to optimize patient treatment recommendations
- Learning causal effect using machine learning with application to China's typhoon
- Cost-sensitive multi-class AdaBoost for understanding driving behavior based on telematics
- Boosting-based sequential output prediction
- Multiclass Boosting Algorithms for Shrinkage Estimators of Class Probability
- Data reduction using a discrete wavelet transform in discriminant analysis of very high dimensionality data
- Model-based transductive learning of the kernel matrix
- Logistic model trees
- An empirical study of using Rotation Forest to improve regressors
- Isotonic boosting classification rules
- Boosting high dimensional predictive regressions with time varying parameters
- A Statistical Approach to Crime Linkage
- BoostWofE: a new sequential weights of evidence model reducing the effect of conditional dependency
- Gradient boosting for distributional regression: faster tuning and improved variable selection via noncyclical updates
- Pathway-based kernel boosting for the analysis of genome-wide association studies
- Multilogistic regression by means of evolutionary product-unit neural networks
- Probability estimation for multi-class classification using adaboost
- Multiclass boosting: margins, codewords, losses, and algorithms
- Fast convergence rates of deep neural networks for classification
- Toward an explainable machine learning model for claim frequency: a use case in car insurance pricing with telematics data
- Uncertainty and forecasts of U.S. recessions
- Tests of the martingale difference hypothesis using boosting and RBF neural network approximations
- Discriminative reranking for natural language parsing
- Probability estimation with machine learning methods for dichotomous and multicategory outcome: theory
- Machine learning based on extended generalized linear model applied in mixture experiments
- A probabilistic classifier ensemble weighting scheme based on cross-validated accuracy estimates
- An empirical comparison of learning algorithms for nonparametric scoring: the \textsc{TreeRank} algorithm and other methods
- Local fractal and multifractal features for volumic texture characterization
- Adaptive step-length selection in gradient boosting for Gaussian location and scale models
- AdaBoost and robust one-bit compressed sensing
- A precise high-dimensional asymptotic theory for boosting and minimum-\(\ell_1\)-norm interpolated classifiers
- Confidence sets with expected sizes for multiclass classification
- Counting and enumerating aggregate classifiers
- Robust variable selection with exponential squared loss for the spatial autoregressive model
- Cost-sensitive ensemble learning: a unifying framework
- On the accuracy of cross-validation in the classification problem
- Parallelizing AdaBoost by weights dynamics
- Logitboost with errors-in-variables
- A cascade of boosted generative and discriminative classifiers for vehicle detection
- Representation in the (artificial) immune system
- Two-step sparse boosting for high-dimensional longitudinal data with varying coefficients
- An update on statistical boosting in biomedicine
- Navigating random forests and related advances in algorithmic modeling
- General sparse boosting: improving feature selection of \(L_{2}\) boosting by correlation-based penalty family
- Boosting GARCH and neural networks for the prediction of heteroskedastic time series
- Logistic regression using covariates obtained by product-unit neural network models
- Tutorial series on brain-inspired computing. VI: Geometrical structure of boosting algorithm
- Boosting with Noisy Data: Some Views from Statistical Theory
- Covariate balancing propensity score by tailored loss functions
- Vote counting measures for ensemble classifiers.
- Functional dissipation microarrays for classification
- A concrete statistical realization of Kleinberg's stochastic discrimination for pattern recognition. I: Two-class classification
- Probing for sparse and fast variable selection with model-based boosting
- Sample size determination for logistic regression
- Using LogitBoost classifier to predict protein structural classes
- Boosted Regression Trees with Errors in Variables
- Sketching information divergences
- Surrogate maximization/minimization algorithms and extensions
- Logitboost autoregressive networks
- A fast genetic method for inducting descriptive fuzzy models.
- Multi-class boosting with asymmetric binary weak-learners
- Self-improved gaps almost everywhere for the agnostic approximation of monomials
- A Fisher consistent multiclass loss function with variable margin on positive examples
- The synergy between PAV and AdaBoost
- Gradient boosting for high-dimensional prediction of rare events
- Noise peeling methods to improve boosting algorithms
- Adaptive index models for marker-based risk stratification
- A \(\mathbb R\)eal generalization of discrete AdaBoost
- Complexity in the case against accuracy estimation
- Using boosting to prune double-bagging ensembles
- On weak base hypotheses and their implications for boosting regression and classification
- Embedding ensemble tracking in a stochastic framework for robust object tracking
- Some relationships between fuzzy and random set-based classifiers and models
- Boosted kernel ridge regression: optimal learning rates and early stopping
- Boosting multi-features with prior knowledge for mini unmanned helicopter landmark detection
This page was built for publication: Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)