Boosting algorithms: regularization, prediction and model fitting
From MaRDI portal
Publication:449780
DOI: 10.1214/07-STS242 · zbMath: 1246.62163 · arXiv: 0804.2752 · OpenAlex: W3099723433 · MaRDI QID: Q449780
Torsten Hothorn, Peter Bühlmann
Publication date: 1 September 2012
Published in: Statistical Science
Full work available at URL: https://arxiv.org/abs/0804.2752
Keywords: survival analysis; variable selection; generalized linear models; generalized additive models; gradient boosting; mboost
MSC classification: Software, source code, etc. for problems pertaining to statistics (62-04); Generalized linear models (logistic models) (62J12); Estimation in survival analysis and censored data (62N02)
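The paper's central method is componentwise gradient boosting: at each iteration, every candidate base-learner is fitted to the current negative-gradient (for squared-error loss, the residuals), and only the best-fitting component is added to the model, shrunk by a step length. The sketch below is an illustrative Python rendering of componentwise L2-boosting with simple linear base-learners, not the mboost implementation itself; the function name and defaults are chosen for this example.

```python
import numpy as np

def componentwise_l2_boost(X, y, n_steps=100, nu=0.1):
    """Illustrative componentwise L2-boosting sketch.

    At each step, fit each single predictor to the current residuals
    by least squares, update only the best-fitting component, and
    shrink the update by the step length nu. Early stopping (small
    n_steps) acts as the regularizer."""
    n, p = X.shape
    intercept = y.mean()            # offset: fit the mean first
    resid = y - intercept           # negative gradient for L2 loss
    coef = np.zeros(p)
    for _ in range(n_steps):
        best_j, best_b, best_rss = 0, 0.0, np.inf
        for j in range(p):
            xj = X[:, j]
            b = xj @ resid / (xj @ xj)          # univariate LS fit
            rss = np.sum((resid - b * xj) ** 2)
            if rss < best_rss:
                best_j, best_b, best_rss = j, b, rss
        coef[best_j] += nu * best_b             # update one component
        resid -= nu * best_b * X[:, best_j]     # refit the residuals
    return intercept, coef
```

Because only one coordinate is updated per step, stopping the loop early leaves most coefficients at exactly zero, which is how boosting performs implicit variable selection in the high-dimensional settings the paper studies.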
Related Items
- Two-step sparse boosting for high-dimensional longitudinal data with varying coefficients
- Early stopping for statistical inverse problems via truncated SVD estimation
- Sequential double cross-validation for assessment of added predictive ability in high-dimensional omic applications
- Penalized likelihood and Bayesian function selection in regression models
- A review on instance ranking problems in statistical learning
- Variable selection in general multinomial logit models
- Boosting multi-state models
- Boosting techniques for nonlinear time series models
- A unified framework of constrained regression
- Wavelet-based gradient boosting
- Survival Regression with Accelerated Failure Time Model in XGBoost
- Feature selection filter for classification of power system operating states
- Toward Efficient Ensemble Learning with Structure Constraints: Convergent Algorithms and Applications
- An empirical comparison of classification algorithms for mortgage default prediction: evidence from a distressed mortgage market
- A precise high-dimensional asymptotic theory for boosting and minimum-\(\ell_1\)-norm interpolated classifiers
- A Tree-Based Semi-Varying Coefficient Model for the COM-Poisson Distribution
- PBoostGA: pseudo-boosting genetic algorithm for variable ranking and selection
- Regression trees for predicting mortality in patients with cardiovascular disease: What improvement is achieved by using ensemble-based methods?
- Boosting functional response models for location, scale and shape with an application to bacterial competition
- Mean and quantile boosting for partially linear additive models
- Improved nearest neighbor classifiers by weighting and selection of predictors
- Probing for sparse and fast variable selection with model-based boosting
- Machine learning based on extended generalized linear model applied in mixture experiments
- Small area estimation of the homeless in Los Angeles: an application of cost-sensitive stochastic gradient boosting
- Boosting kernel-based dimension reduction for jointly propagating spatial variability and parameter uncertainty in long-running flow simulators
- Variable selection for generalized linear mixed models by \(L_1\)-penalized estimation
- Bayesian variable selection and estimation in semiparametric joint models of multivariate longitudinal and survival data
- Accelerated gradient boosting
- A simple extension of boosting for asymmetric mislabeled data
- Forecasting with many predictors: is boosting a viable alternative?
- Ensemble classification of paired data
- To explain or to predict?
- Geoadditive expectile regression
- Boosting flexible functional regression models with a high number of functional historical effects
- Marginal integration for nonparametric causal inference
- Gradient boosting for distributional regression: faster tuning and improved variable selection via noncyclical updates
- Empirical risk minimization is optimal for the convex aggregation problem
- Semiparametric regression during 2003--2007
- Survival ensembles by the sum of pairwise differences with application to lung cancer microarray studies
- Generalised joint regression for count data: a penalty extension for competitive settings
- Discriminant analyses of peanut allergy severity scores
- Functional gradient ascent for probit regression
- RandGA: injecting randomness into parallel genetic algorithm for variable selection
- Invariance, causality and robustness
- Boosting iterative stochastic ensemble method for nonlinear calibration of subsurface flow models
- Generalized additive models with unknown link function including variable selection
- A general framework for functional regression modelling
- Comparison and contrast of two general functional regression modelling frameworks
- Boosting for statistical modelling - a non-technical introduction
- Predicting matches in international football tournaments with random forests
- An overview of techniques for linking high‐dimensional molecular data to time‐to‐event endpoints by risk prediction models
- CAM: causal additive models, high-dimensional order search and penalized regression
- Rank-based estimation in the ℓ1-regularized partly linear model for censored outcomes with application to integrated analyses of clinical predictors and gene expression data
- Predicting the Whole Distribution with Methods for Depth Data Analysis Demonstrated on a Colorectal Cancer Treatment Study
- Spike-and-Slab Priors for Function Selection in Structured Additive Regression Models
- Use of pretransformation to cope with extreme values in important candidate features
- Boosting additive models using component-wise P-splines
- Boosting nonlinear additive autoregressive time series
- Boosted coefficient models
- Ridge estimation for multinomial logit models with symmetric side constraints
- Variable selection and model choice in structured survival models
- Multinomial logit models with implicit variable selection
- Model-based boosting in R: a hands-on tutorial using the R package mboost
- Quadratic Majorization for Nonconvex Loss with Applications to the Boosting Algorithm
- Robust boosting with truncated loss functions
- The reliability of classification of terminal nodes in GUIDE decision tree to predict the nonalcoholic fatty liver disease
- Determining cutoff point of ensemble trees based on sample size in predicting clinical dose with DNA microarray data
- On the choice and influence of the number of boosting steps for high-dimensional linear Cox-models
- Delta Boosting Machine with Application to General Insurance
- New multicategory boosting algorithms based on multicategory Fisher-consistent losses
- SEMIPARAMETRIC REGRESSION AND GRAPHICAL MODELS
- Generalized additive models with flexible response functions
- Entropy-based estimation in classification problems
- Boosting in Cox regression: a comparison between the likelihood-based and the model-based approaches with focus on the R-packages \textit{CoxBoost} and \textit{mboost}
- Boosting high dimensional predictive regressions with time varying parameters
- Nonparametric estimation of the link function including variable selection
- Transformation boosting machines
- Inference for \(L_2\)-boosting
- Beyond mean regression
- The functional linear array model
- Subject-specific Bradley–Terry–Luce models with implicit variable selection
- Boosted nonparametric hazards with time-dependent covariates
- Automatic model selection for high-dimensional survival analysis
- Regularized proportional odds models
- Nonparametric multiple expectile regression via ER-Boost
- Random gradient boosting for predicting conditional quantiles
- Dimension reduction boosting
- Variable Selection and Model Choice in Geoadditive Regression Models
- Generalized Additive Models for Pair-Copula Constructions
- Three Categories Customer Churn Prediction Based on the Adjusted Real Adaboost
- Sparse kernel deep stacking networks
- Asymptotic linear expansion of regularized M-estimators
- Some challenges for statistics
- Detection of differential item functioning in Rasch models by boosting techniques
- High-dimensional additive modeling
- Boosting as a kernel-based method
- Adaptive step-length selection in gradient boosting for Gaussian location and scale models
- A penalty approach to differential item functioning in Rasch models
- General Sparse Boosting: Improving Feature Selection of L2Boosting by Correlation-Based Penalty Family
- De-noising boosting methods for variable selection and estimation subject to error-prone variables
- On the selection of predictors by using greedy algorithms and information theoretic criteria
- Boosting Distributional Copula Regression
- Explainable subgradient tree boosting for prescriptive analytics in operations management
- Accelerated Componentwise Gradient Boosting Using Efficient Data Representation and Momentum-Based Optimization
- Gradient boosting with extreme-value theory for wildfire prediction
- A boosting first-hitting-time model for survival analysis in high-dimensional settings
- Prediction of sports injuries in football: a recurrent time-to-event approach using regularized Cox models
- On the relevance of prognostic information for clinical trials: A theoretical quantification
- Estimating global and country-specific excess mortality during the COVID-19 pandemic
- Estimation and inference of treatment effects with \(L_2\)-boosting in high-dimensional settings
- Data-driven state-of-charge prediction of a storage cell using ABC/GBRT, ABC/MLP and Lasso machine learning techniques
- Quantitative robustness of instance ranking problems
- Unbiased Boosting Estimation for Censored Survival Data
- Modeling Postoperative Mortality in Older Patients by Boosting Discrete-Time Competing Risks Models
- Privacy-preserving and lossless distributed estimation of high-dimensional generalized additive mixed models
- Stochastic boosting algorithms
- Stochastic Approximation Boosting for Incomplete Data Problems
- Optimization by Gradient Boosting
- Modelling Flow in Gas Transmission Networks Using Shape-Constrained Expectile Regression
Uses Software
Cites Work
- Greedy function approximation: A gradient boosting machine.
- Bagging predictors
- Multi-class AdaBoost
- The Adaptive Lasso and Its Oracle Properties
- ElemStatLearn
- Knot selection by boosting techniques
- Boosting ridge regression
- On boosting kernel regression
- Generalized additive models
- A decision-theoretic generalization of on-line learning and an application to boosting
- BoosTexter: A boosting-based system for text categorization
- Persistence in high-dimensional linear predictor selection and the virtue of overparametrization
- Arcing classifiers. (With discussion)
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Empirical margin distributions and bounding the generalization error of combined classifiers
- Unified methods for censored longitudinal data and causality
- Least angle regression. (With discussion)
- Process consistency for AdaBoost.
- On the Bayes-risk consistency of regularized boosting methods.
- Weak greedy algorithms
- A multivariate FGD technique to improve VaR computation in equity markets
- Boosting for high-dimensional linear models
- High-dimensional graphs and variable selection with the Lasso
- Boosting with early stopping: convergence and consistency
- On early stopping in gradient descent learning
- Better Subset Regression Using the Nonnegative Garrote
- Survival ensembles
- Smoothing Parameter Selection in Nonparametric Regression Using an Improved Akaike Information Criterion
- Cryptographic limitations on learning Boolean formulae and finite automata
- Model Selection and the Principle of Minimum Description Length
- Boosting With the L2 Loss
- A new approach to variable selection in least squares problems
- Aggregating classifiers with ordinal response structure
- DOI: 10.1162/1532443041424319
- DOI: 10.1162/153244304773936108
- Matching pursuits with time-frequency dictionaries
- Generalized Additive Modeling with Implicit Variable Selection by Likelihood‐Based Boosting
- Convergence Rates of General Regularization Methods for Statistical Inverse Problems and Applications
- Generalized monotonic regression based on B-splines with an application to air pollution data
- Convexity, Classification, and Risk Bounds
- Soft margins for AdaBoost
- The elements of statistical learning. Data mining, inference, and prediction
- Random forests
- Looking for lumps: boosting and bagging for density estimation.