Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
From MaRDI portal
Recommendations
- Boosting in structured additive models.
- Rejoinder: Boosting algorithms: regularization, prediction and model fitting
- Generalized Additive Modeling with Implicit Variable Selection by Likelihood‐Based Boosting
- Additive prediction and boosting for functional data
- Comment on: Boosting algorithms: regularization, prediction and model fitting
- Mean and quantile boosting for partially linear additive models
Cites work
- scientific article (untitled); zbMATH DE number 3860199
- scientific article (untitled); zbMATH DE number 47282
- scientific article (untitled); zbMATH DE number 47310
- scientific article (untitled); zbMATH DE number 1219016
- scientific article (untitled); zbMATH DE number 3793774
- A Bayesian CART algorithm
- A decision-theoretic generalization of on-line learning and an application to boosting
- A theory of the learnable
- An adaptive version of the boost by majority algorithm
- Arcing classifiers. (With discussion)
- Bagging predictors
- Bayesian backfitting. (With comments and a rejoinder).
- Bayesian curve-fitting with free-knot splines
- Boosting a weak learning algorithm by majority
- Boosting first-order learning
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Classification by pairwise coupling
- Flexible Discriminant Analysis by Optimal Scoring
- Improved boosting algorithms using confidence-rated predictions
- Linear smoothers and additive models
- Matching pursuits with time-frequency dictionaries
- Multivariate adaptive regression splines
- Nearest neighbor pattern classification
- On bagging and nonlinear estimation
- Robust Estimation of a Location Parameter
- Soft margins for AdaBoost
- Using iterated bagging to debias regressions
- Variance reduction trends on `boosted' classifiers
- Very simple classification rules perform well on most commonly used datasets
Cited in
(only showing first 100 items)
- Statistical modeling: The two cultures. (With comments and a rejoinder).
- Listwise approaches based on feature ranking discovery
- Logistic model trees
- Counting and enumerating aggregate classifiers
- Least angle regression. (With discussion)
- Confidence sets with expected sizes for multiclass classification
- A conversation with Jerry Friedman
- Large dimensional analysis of general margin based classification methods
- Regularized Estimation in the Accelerated Failure Time Model with High-Dimensional Covariates
- A hybrid generalized propensity score approach for observational studies
- Robust Boosting Algorithm Against Mislabeling in Multiclass Problems
- scientific article (untitled); zbMATH DE number 1931832
- Semiparametric regression during 2003--2007
- On the effect of obesity on employment in the presence of observed and unobserved confounding
- Stratified normalization logitboost for two-class unbalanced data classification
- Non-crossing large-margin probability estimation and its application to robust SVM via preconditioning
- Weighted bagging: a modification of AdaBoost from the perspective of importance sampling
- Different Paradigms for Choosing Sequential Reweighting Algorithms
- Improving nonparametric regression methods by bagging and boosting.
- A new accelerated proximal boosting machine with convergence rate \(O(1/t^2)\)
- Machine learning for corporate default risk: multi-period prediction, frailty correlation, loan portfolios, and tail probabilities
- A generic path algorithm for regularized statistical estimation
- Stochastic approximation: from statistical origin to big-data, multidisciplinary applications
- Does modeling lead to more accurate classification? A study of relative efficiency in linear classification
- Robust variable selection with exponential squared loss for the spatial autoregressive model
- Cost-sensitive ensemble learning: a unifying framework
- On the accuracy of cross-validation in the classification problem
- Parallelizing AdaBoost by weights dynamics
- Logitboost with errors-in-variables
- Additive prediction and boosting for functional data
- Instance-dependent cost-sensitive learning for detecting transfer fraud
- A comparative study of the leading machine learning techniques and two new optimization algorithms
- On the Effect and Remedies of Shrinkage on Classification Probability Estimation
- Adaptive sampling for large scale boosting
- Effect of data preprocessing on ensemble learning for classification in disease diagnosis
- Blasso for object categorization and retrieval: towards interpretable visual models
- New aspects of Bregman divergence in regression and classification with parametric and nonparametric estimation
- A Hybrid Approach of Boosting Against Noisy Data
- Boosting in the presence of outliers: adaptive classification with nonconvex loss functions
- A cascade of boosted generative and discriminative classifiers for vehicle detection
- scientific article (untitled); zbMATH DE number 1945802
- Representation in the (artificial) immune system
- Statistically validated lead-lag networks and inventory prediction in the foreign exchange market
- Composite large margin classifiers with latent subclasses for heterogeneous biomedical data
- Skills in demand for ICT and statistical occupations: Evidence from web‐based job vacancies
- Delta Boosting Machine with Application to General Insurance
- Remembering Leo Breiman
- Remembrance of Leo Breiman
- Information Geometry of U-Boost and Bregman Divergence
- Top-down decision tree learning as information based boosting
- A geometric approach to leveraging weak learners
- Two-step sparse boosting for high-dimensional longitudinal data with varying coefficients
- An update on statistical boosting in biomedicine
- On the Bayes-risk consistency of regularized boosting methods.
- Dimension reduction boosting
- SVM-boosting based on Markov resampling: theory and algorithm
- Robust Variable Selection With Exponential Squared Loss
- Beyond sequential covering -- boosted decision rules
- The boosting approach to machine learning: an overview
- Modeling threshold interaction effects through the logistic classification trunk
- A simple extension of boosting for asymmetric mislabeled data
- Variance reduction trends on `boosted' classifiers
- Statistical monitoring of nominal logistic profiles in phase II
- Calibrating AdaBoost for phoneme classification
- Navigating random forests and related advances in algorithmic modeling
- Pseudo-value regression trees
- An incremental aggregated proximal ADMM for linearly constrained nonconvex optimization with application to sparse logistic regression problems
- Boosting iterative stochastic ensemble method for nonlinear calibration of subsurface flow models
- A boosting first-hitting-time model for survival analysis in high-dimensional settings
- Iterative bias reduction: a comparative study
- Random gradient boosting for predicting conditional quantiles
- Local uncertainty sampling for large-scale multiclass logistic regression
- Learning ELM-tree from big data based on uncertainty reduction
- Three categories customer churn prediction based on the adjusted real AdaBoost
- Transformation boosting machines
- Appropriate machine learning techniques for credit scoring and bankruptcy prediction in banking and finance: A comparative study
- Statistical behavior and consistency of classification methods based on convex risk minimization.
- Response versus gradient boosting trees, GLMs and neural networks under Tweedie loss and log-link
- New Bootstrap Applications in Supervised Learning
- A dynamic model of expected bond returns: A functional gradient descent approach
- Ensemble of fast learning stochastic gradient boosting
- Boosting and instability for regression trees
- Big data analytics for seismic fracture identification using amplitude-based statistics
- General sparse boosting: improving feature selection of \(L_{2}\) boosting by correlation-based penalty family
- A unified classification model based on robust optimization
- Quadratic boosting
- Boosting GARCH and neural networks for the prediction of heteroskedastic time series
- Statistical Learning With Time Series Dependence: An Application to Scoring Sleep in Mice
- Improved boosting algorithms using confidence-rated predictions
- Evolution of high-frequency systematic trading: a performance-driven gradient boosting model
- A calibrated multiclass extension of AdaBoost
- New multicategory boosting algorithms based on multicategory Fisher-consistent losses
- Logistic regression using covariates obtained by product-unit neural network models
- Some accelerated alternating proximal gradient algorithms for a class of nonconvex nonsmooth problems
- Tutorial series on brain-inspired computing. VI: Geometrical structure of boosting algorithm
- Subject-specific Bradley–Terry–Luce models with implicit variable selection
- Robust estimation in partially linear regression models
- Boosting Distributional Copula Regression
- Robustifying AdaBoost by Adding the Naive Error Rate
- Component-wise AdaBoost algorithms for high-dimensional binary classification and class probability prediction
This page was built for publication: Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)