Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
DOI: 10.1214/AOS/1016218223 · zbMATH Open: 1106.62323 · OpenAlex: W2024046085 · Wikidata: Q93494458 (Scholia: Q93494458) · MaRDI QID: Q1848780 · FDO: Q1848780
Authors: Jerome H. Friedman, Trevor Hastie, Robert Tibshirani
Publication date: 14 November 2002
Published in: The Annals of Statistics
Full work available at URL: https://projecteuclid.org/euclid.aos/1016218223
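The paper's central observation is that (discrete) AdaBoost can be read as stagewise additive modeling under an exponential loss. As a quick illustration of that view, here is a minimal, self-contained sketch using one-dimensional decision stumps; the function names and the toy data are illustrative, not taken from the paper.

```python
# Minimal sketch of discrete AdaBoost with stump base learners,
# illustrating boosting as stagewise additive modeling under an
# exponential loss. Purely illustrative; not the authors' code.
import math

def fit_stump(X, y, w):
    """Return (weighted error, threshold, polarity) of the best stump."""
    best = None
    for thr in sorted(set(X)):
        for pol in (1, -1):
            pred = [pol if x < thr else -pol for x in X]
            err = sum(wi for wi, p, yi in zip(w, pred, y) if p != yi)
            if best is None or err < best[0]:
                best = (err, thr, pol)
    return best

def adaboost(X, y, rounds=10):
    n = len(X)
    w = [1.0 / n] * n          # uniform initial example weights
    ensemble = []
    for _ in range(rounds):
        err, thr, pol = fit_stump(X, y, w)
        err = max(err, 1e-12)  # avoid log(0) for a perfect stump
        if err >= 0.5:
            break              # weak learner no better than chance
        alpha = 0.5 * math.log((1 - err) / err)  # stagewise coefficient
        ensemble.append((alpha, thr, pol))
        # Exponential-loss reweighting: upweight misclassified points.
        w = [wi * math.exp(-alpha * yi * (pol if x < thr else -pol))
             for wi, x, yi in zip(w, X, y)]
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def predict(ensemble, x):
    """Sign of the additive model F(x) = sum_m alpha_m * h_m(x)."""
    F = sum(a * (pol if x < thr else -pol) for a, thr, pol in ensemble)
    return 1 if F >= 0 else -1
```

In the paper's terminology, each round adds one term `alpha_m * h_m(x)` to an additive model `F(x)`, and the weight update is exactly the exponential-loss criterion that the authors connect to (half) the logit of the class probability.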
Recommendations
- Boosting in structured additive models.
- Rejoinder: Boosting algorithms: regularization, prediction and model fitting
- Generalized Additive Modeling with Implicit Variable Selection by Likelihood-Based Boosting
- Additive prediction and boosting for functional data
- Comment on: Boosting algorithms: regularization, prediction and model fitting
- Mean and quantile boosting for partially linear additive models
MSC classification:
- Nonparametric regression and quantile regression (62G08)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Learning and adaptive systems in artificial intelligence (68T05)
Cites Work
- A decision-theoretic generalization of on-line learning and an application to boosting
- Bagging predictors
- Multivariate adaptive regression splines
- Bayesian backfitting. (With comments and a rejoinder).
- A Bayesian CART algorithm
- Robust Estimation of a Location Parameter
- Nearest neighbor pattern classification
- On bagging and nonlinear estimation
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Bayesian curve-fitting with free-knot splines
- Flexible Discriminant Analysis by Optimal Scoring
- Matching pursuits with time-frequency dictionaries
- An adaptive version of the boost by majority algorithm
- Arcing classifiers. (With discussion)
- Very simple classification rules perform well on most commonly used datasets
- Improved boosting algorithms using confidence-rated predictions
- Soft margins for AdaBoost
- Linear smoothers and additive models
- Classification by pairwise coupling
- Boosting a weak learning algorithm by majority
- A theory of the learnable
- Using iterated bagging to debias regressions
- Boosting first-order learning
- Variance reduction trends on `boosted' classifiers
Cited In (only showing first 100 items)
- A new accelerated proximal boosting machine with convergence rate \(O(1/t^2)\)
- Machine learning for corporate default risk: multi-period prediction, frailty correlation, loan portfolios, and tail probabilities
- On the Effect and Remedies of Shrinkage on Classification Probability Estimation
- Stochastic approximation: from statistical origin to big-data, multidisciplinary applications
- Instance-dependent cost-sensitive learning for detecting transfer fraud
- A comparative study of the leading machine learning techniques and two new optimization algorithms
- STATISTICALLY VALIDATED LEAD-LAG NETWORKS AND INVENTORY PREDICTION IN THE FOREIGN EXCHANGE MARKET
- Dimension reduction boosting
- Top-down decision tree learning as information based boosting
- SVM-boosting based on Markov resampling: theory and algorithm
- Variance reduction trends on `boosted' classifiers
- Calibrating AdaBoost for phoneme classification
- Local uncertainty sampling for large-scale multiclass logistic regression
- Ensemble of fast learning stochastic gradient boosting
- Some accelerated alternating proximal gradient algorithms for a class of nonconvex nonsmooth problems
- Evolution of high-frequency systematic trading: a performance-driven gradient boosting model
- Big data analytics for seismic fracture identification using amplitude-based statistics
- Subject-specific Bradley–Terry–Luce models with implicit variable selection
- Robust estimation in partially linear regression models
- A fast algorithm for the accelerated failure time model with high-dimensional time-to-event data
- Nonparametric Modeling of Neural Point Processes via Stochastic Gradient Boosting Regression
- Automatic model selection for high-dimensional survival analysis
- Combining biomarkers to optimize patient treatment recommendations
- Learning causal effect using machine learning with application to China's typhoon
- Cost-sensitive multi-class AdaBoost for understanding driving behavior based on telematics
- Boosting-based sequential output prediction
- Multiclass Boosting Algorithms for Shrinkage Estimators of Class Probability
- Data reduction using a discrete wavelet transform in discriminant analysis of very high dimensionality data
- Model-based transductive learning of the kernel matrix
- Logistic model trees
- An empirical study of using Rotation Forest to improve regressors
- Isotonic boosting classification rules
- Boosting high dimensional predictive regressions with time varying parameters
- A Statistical Approach to Crime Linkage
- BoostWofE: a new sequential weights of evidence model reducing the effect of conditional dependency
- Gradient boosting for distributional regression: faster tuning and improved variable selection via noncyclical updates
- Pathway-based kernel boosting for the analysis of genome-wide association studies
- Multilogistic regression by means of evolutionary product-unit neural networks
- Probability estimation for multi-class classification using adaboost
- Multiclass boosting: margins, codewords, losses, and algorithms
- Fast convergence rates of deep neural networks for classification
- Toward an explainable machine learning model for claim frequency: a use case in car insurance pricing with telematics data
- Uncertainty and forecasts of U.S. recessions
- Tests of the martingale difference hypothesis using boosting and RBF neural network approximations
- Discriminative reranking for natural language parsing
- Probability estimation with machine learning methods for dichotomous and multicategory outcome: theory
- Machine learning based on extended generalized linear model applied in mixture experiments
- A probabilistic classifier ensemble weighting scheme based on cross-validated accuracy estimates
- An empirical comparison of learning algorithms for nonparametric scoring: the \textsc{TreeRank} algorithm and other methods
- Local fractal and multifractal features for volumic texture characterization
- Adaptive step-length selection in gradient boosting for Gaussian location and scale models
- AdaBoost and robust one-bit compressed sensing
- A precise high-dimensional asymptotic theory for boosting and minimum-\(\ell_1\)-norm interpolated classifiers
- Confidence sets with expected sizes for multiclass classification
- A generic path algorithm for regularized statistical estimation
- Non-crossing large-margin probability estimation and its application to robust SVM via preconditioning
- Beyond sequential covering -- boosted decision rules
- Remembering Leo Breiman
- Remembrance of Leo Breiman
- A simple extension of boosting for asymmetric mislabeled data
- A unified classification model based on robust optimization
- New multicategory boosting algorithms based on multicategory Fisher-consistent losses
- Aggregating classifiers with ordinal response structure
- Boosting in Cox regression: a comparison between the likelihood-based and the model-based approaches with focus on the R-packages \textit{CoxBoost} and \textit{mboost}
- Further results on the margin explanation of boosting: new algorithm and experiments
- Obtaining linguistic fuzzy rule-based regression models from imprecise data with multiobjective genetic algorithms
- Detection of differential item functioning in Rasch models by boosting techniques
- Rejoinder: Boosting algorithms: regularization, prediction and model fitting
- A noise-detection based AdaBoost algorithm for mislabeled data
- Functional gradient ascent for probit regression
- Density estimation with stagewise optimization of the empirical risk
- Boosting nonlinear additive autoregressive time series
- Boosted Bayesian network classifiers
- Boosting ridge regression
- Robust exponential squared loss-based variable selection for high-dimensional single-index varying-coefficient model
- \(L_{2}\) boosting in kernel regression
- Improved customer choice predictions using ensemble methods
- Fully corrective boosting with arbitrary loss and regularization
- Stochastic boosting algorithms
- On a method for constructing ensembles of regression models
- Looking for lumps: boosting and bagging for density estimation.
- Boosting in the presence of noise
- A weight-adjusted voting algorithm for ensembles of classifiers
- Soft-max boosting
- An extensive comparison of recent classification tools applied to microarray data
- Exact bootstrap \(k\)-nearest neighbor learners
- On boosting kernel regression
- Sparse regression and support recovery with \(\mathbb{L}_2\)-boosting algorithms
- A local boosting algorithm for solving classification problems
- Soft memberships for spectral clustering, with application to permeable language distinction
- Multinomial logit models with implicit variable selection
- Designing a boosted classifier on Riemannian manifolds
- Population theory for boosting ensembles.
- A sharp nonasymptotic bound and phase diagram of \(L_{1/2}\) regularization