Boosting algorithms: regularization, prediction and model fitting
DOI: 10.1214/07-STS242 · zbMATH Open: 1246.62163 · arXiv: 0804.2752 · OpenAlex: W3099723433 · MaRDI QID: Q449780 · FDO: Q449780
Torsten Hothorn, Peter Bühlmann
Publication date: 1 September 2012
Published in: Statistical Science
Full work available at URL: https://arxiv.org/abs/0804.2752
Keywords: variable selection; generalized additive models; generalized linear models; survival analysis; gradient boosting; \texttt{mboost}
MSC classification: Estimation in survival analysis and censored data (62N02); Software, source code, etc. for problems pertaining to statistics (62-04); Generalized linear models (logistic models) (62J12)
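For orientation, the central method of the paper is generic functional gradient descent (component-wise gradient boosting). The following is a compact, informal restatement of one boosting iteration, using notation in the spirit of the article (loss function \(\rho\), step-length factor \(\nu\), base procedure \(\hat g\), stopping iteration \(m_{\mathrm{stop}}\)); it is a sketch for the reader, not a substitute for the algorithm as stated there:
\[
U_i^{[m]} = -\left.\frac{\partial}{\partial f}\,\rho(Y_i, f)\right|_{f=\hat f^{[m-1]}(X_i)}, \qquad i = 1, \dots, n,
\]
\[
\hat f^{[m]}(\cdot) = \hat f^{[m-1]}(\cdot) + \nu\,\hat g^{[m]}(\cdot), \qquad 0 < \nu \le 1,
\]
where \(\hat g^{[m]}\) is the base procedure (for example, component-wise least squares or P-splines) fitted to the pseudo-responses \((X_i, U_i^{[m]})\), and the number of iterations \(m_{\mathrm{stop}}\) acts as the main regularization parameter, chosen for instance by AIC-type criteria or cross-validation.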
Cites Work
- A decision-theoretic generalization of on-line learning and an application to boosting
- ElemStatLearn
- The elements of statistical learning. Data mining, inference, and prediction
- Greedy function approximation: A gradient boosting machine.
- The Adaptive Lasso and Its Oracle Properties
- Least angle regression. (With discussion)
- Random forests
- Bagging predictors
- High-dimensional graphs and variable selection with the Lasso
- Generalized additive models
- Persistence in high-dimensional linear predictor selection and the virtue of overparametrization
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Unified methods for censored longitudinal data and causality
- Boosting for high-dimensional linear models
- Survival ensembles
- Matching pursuits with time-frequency dictionaries
- Boosting with early stopping: convergence and consistency
- On early stopping in gradient descent learning
- Multi-class AdaBoost
- Arcing classifiers. (With discussion)
- Better Subset Regression Using the Nonnegative Garrote
- Boosting With the \(L_2\) Loss
- A new approach to variable selection in least squares problems
- Generalized Additive Modeling with Implicit Variable Selection by Likelihood‐Based Boosting
- Generalized monotonic regression based on B-splines with an application to air pollution data
- Soft margins for AdaBoost
- Smoothing Parameter Selection in Nonparametric Regression Using an Improved Akaike Information Criterion
- Empirical margin distributions and bounding the generalization error of combined classifiers
- Model Selection and the Principle of Minimum Description Length
- Convexity, Classification, and Risk Bounds
- BoosTexter: A boosting-based system for text categorization
- Weak greedy algorithms
- Convergence Rates of General Regularization Methods for Statistical Inverse Problems and Applications
- Cryptographic limitations on learning Boolean formulae and finite automata
- On the Bayes-risk consistency of regularized boosting methods.
- Boosting ridge regression
- Process consistency for AdaBoost.
- Knot selection by boosting techniques
- On boosting kernel regression
- A multivariate FGD technique to improve VaR computation in equity markets
- Aggregating classifiers with ordinal response structure
- DOI: 10.1162/1532443041424319
- DOI: 10.1162/153244304773936108
- Looking for lumps: boosting and bagging for density estimation.
Cited In (first 100 items shown)
- Semiparametric regression during 2003--2007
- A penalty approach to differential item functioning in Rasch models
- Sparse kernel deep stacking networks
- A unified framework of constrained regression
- Ridge estimation for multinomial logit models with symmetric side constraints
- Boosted coefficient models
- Model-based boosting 2.0
- A simple extension of boosting for asymmetric mislabeled data
- Boosting iterative stochastic ensemble method for nonlinear calibration of subsurface flow models
- Transformation boosting machines
- Invariance, causality and robustness
- New multicategory boosting algorithms based on multicategory Fisher-consistent losses
- Variable selection and model choice in structured survival models
- The functional linear array model
- Boosting multi-state models
- On the choice and influence of the number of boosting steps for high-dimensional linear Cox-models
- Boosting in Cox regression: a comparison between the likelihood-based and the model-based approaches with focus on the R-packages \textit{CoxBoost} and \textit{mboost}
- Accelerated gradient boosting
- Detection of differential item functioning in Rasch models by boosting techniques
- Functional gradient ascent for probit regression
- Boosting nonlinear additive autoregressive time series
- Stochastic Approximation Boosting for Incomplete Data Problems
- Feature selection filter for classification of power system operating states
- Stochastic boosting algorithms
- Stochastic boosting algorithms
- Boosting kernel-based dimension reduction for jointly propagating spatial variability and parameter uncertainty in long-running flow simulators
- Three Categories Customer Churn Prediction Based on the Adjusted Real Adaboost
- An empirical comparison of classification algorithms for mortgage default prediction: evidence from a distressed mortgage market
- Variable Selection and Model Choice in Geoadditive Regression Models
- Model-based boosting in R: a hands-on tutorial using the R package mboost
- Generalized Additive Models for Pair-Copula Constructions
- Small area estimation of the homeless in Los Angeles: an application of cost-sensitive stochastic gradient boosting
- Multinomial logit models with implicit variable selection
- PBoostGA: pseudo-boosting genetic algorithm for variable ranking and selection
- Quadratic Majorization for Nonconvex Loss with Applications to the Boosting Algorithm
- Robust boosting with truncated loss functions
- Spike-and-Slab Priors for Function Selection in Structured Additive Regression Models
- RandGA: injecting randomness into parallel genetic algorithm for variable selection
- A general framework for functional regression modelling
- Rank-based estimation in the ℓ1-regularized partly linear model for censored outcomes with application to integrated analyses of clinical predictors and gene expression data
- Improved nearest neighbor classifiers by weighting and selection of predictors
- Mean and quantile boosting for partially linear additive models
- Entropy-based estimation in classification problems
- CAM: causal additive models, high-dimensional order search and penalized regression
- Some challenges for statistics
- Marginal integration for nonparametric causal inference
- Empirical risk minimization is optimal for the convex aggregation problem
- Ensemble classification of paired data
- Forecasting with many predictors: is boosting a viable alternative?
- Survival ensembles by the sum of pairwise differences with application to lung cancer microarray studies
- High-dimensional additive modeling
- Boosting. Foundations and algorithms.
- Beyond mean regression
- Nonparametric multiple expectile regression via ER-Boost
- Generalized additive models with unknown link function including variable selection
- Generalized additive models with flexible response functions
- Geoadditive expectile regression
- Variable selection for generalized linear mixed models by \(L_1\)-penalized estimation
- Toward Efficient Ensemble Learning with Structure Constraints: Convergent Algorithms and Applications
- Boosting as a kernel-based method
- Penalized likelihood and Bayesian function selection in regression models
- Variable selection in general multinomial logit models
- Semiparametric regression and graphical models
- To explain or to predict?
- Optimization by Gradient Boosting
- Regression trees for predicting mortality in patients with cardiovascular disease: what improvement is achieved by using ensemble-based methods?
- Nonparametric estimation of the link function including variable selection
- Boosting additive models using component-wise P-splines
- Generalised joint regression for count data: a penalty extension for competitive settings
- Calibrating machine learning approaches for probability estimation: a comprehensive comparison
- Sample size and predictive performance of machine learning methods with survival data: a simulation study
- Buckley-James boosting model based on extreme learning machine and random survival forests
- Delta Boosting Machine with Application to General Insurance
- Dimension reduction boosting
- Forecasting retained earnings of privately held companies with PCA and \(L^1\) regression
- Pseudo-value regression trees
- Two-step sparse boosting for high-dimensional longitudinal data with varying coefficients
- Early stopping for statistical inverse problems via truncated SVD estimation
- A boosting first-hitting-time model for survival analysis in high-dimensional settings
- On the relevance of prognostic information for clinical trials: A theoretical quantification
- Random gradient boosting for predicting conditional quantiles
- Sequential double cross-validation for assessment of added predictive ability in high-dimensional omic applications
- Boosting Distributional Copula Regression
- Subject-specific Bradley–Terry–Luce models with implicit variable selection
- Prediction-based variable selection for component-wise gradient boosting
- Gradient boosting for linear mixed models
- Significance tests for boosted location and scale models with linear base-learners
- Automatic model selection for high-dimensional survival analysis
- A review on instance ranking problems in statistical learning
- Determining cutoff point of ensemble trees based on sample size in predicting clinical dose with DNA microarray data
- The reliability of classification of terminal nodes in GUIDE decision tree to predict the nonalcoholic fatty liver disease
- Wavelet-based gradient boosting
- Quantitative robustness of instance ranking problems
- De-noising boosting methods for variable selection and estimation subject to error-prone variables
- Conditional transformation models for survivor function estimation
- Boosting techniques for nonlinear time series models
- Boosting with missing predictors
- Predicting the Whole Distribution with Methods for Depth Data Analysis Demonstrated on a Colorectal Cancer Treatment Study
- Probing for sparse and fast variable selection with model-based boosting
Uses Software
- mboost
Recommendations
- Comment: Boosting algorithms: regularization, prediction and model fitting
- Robust boosting for regression problems
- Boosting. Foundations and algorithms.
- Comment on: Boosting algorithms: regularization, prediction and model fitting
- Rejoinder: Boosting algorithms: regularization, prediction and model fitting
- Stochastic boosting algorithms
- Boosting methods for regression
- Stochastic boosting algorithms