gbm
From MaRDI portal
An implementation of extensions to Freund and Schapire's AdaBoost algorithm and Friedman's gradient boosting machine. Includes regression methods for least squares, absolute loss, t-distribution loss, quantile regression, logistic, multinomial logistic, Poisson, Cox proportional hazards partial likelihood, AdaBoost exponential loss, Huberized hinge loss, and Learning to Rank measures (LambdaMart). Originally developed by Greg Ridgeway. Newer version available at github.com/gbm-developers/gbm3.
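As a minimal sketch of how the package is typically called (assuming `gbm` is installed; the data frame and column names here are illustrative, not from the package documentation):

```r
# Fit a boosted regression model with gbm on a small synthetic data set.
library(gbm)

set.seed(1)
n <- 200
dat <- data.frame(x1 = runif(n), x2 = runif(n))
dat$y <- dat$x1 + rnorm(n, sd = 0.1)

# Squared-error loss corresponds to distribution = "gaussian";
# other options include "laplace", "tdist", "quantile", "bernoulli",
# "multinomial", "poisson", "coxph", "adaboost", and "pairwise".
fit <- gbm(y ~ x1 + x2, data = dat,
           distribution = "gaussian",
           n.trees = 100, interaction.depth = 2,
           shrinkage = 0.1, bag.fraction = 0.5)

pred <- predict(fit, newdata = dat, n.trees = 100)
```

The `distribution` argument selects among the loss functions listed above, while `n.trees`, `interaction.depth`, and `shrinkage` control the size of the ensemble, the depth of each tree, and the learning rate.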
Cited in
- Statistical learning from a regression perspective
- Cultural consensus theory for the evaluation of patients' mental health scores in forensic psychiatric hospitals
- Least angle regression. (With discussion)
- Runtime and memory consumption analyses for machine learning R programs
- Adaptive stochastic gradient boosting tree with composite criterion
- Comparing different propensity score estimation methods for estimating the marginal causal effect through standardization to propensity scores
- Computing AIC for black-box models using generalized degrees of freedom: A comparison with cross-validation
- Logitboost with errors-in-variables
- Boosted coefficient models
- Biased penalty calls in the National Hockey League
- Delta Boosting Machine with Application to General Insurance
- Presence‐Only Data and the EM Algorithm
- Arbitrage of forecasting experts
- Component-wise AdaBoost algorithms for high-dimensional binary classification and class probability prediction
- Determining cutoff point of ensemble trees based on sample size in predicting clinical dose with DNA microarray data
- Wavelet-based gradient boosting
- Predictive analytics of insurance claims using multivariate decision trees
- Boosting in Cox regression: a comparison between the likelihood-based and the model-based approaches with focus on the R-packages \textit{CoxBoost} and \textit{mboost}
- Optimization of tree ensembles
- Accelerated gradient boosting
- Statistical learning from a regression perspective
- Random forests, decision trees, and categorical predictors: the "absent levels" problem
- Estimator selection and combination in scalar-on-function regression
- Detecting the impact area of BP deepwater horizon oil discharge: an analysis by time varying coefficient logistic models and boosted trees
- MaJIC
- pls
- ROCR
- gam
- rpart
- mboost
- e1071
- caTools
- ISwR
- nnet
- randomForest
- rpart.plot
- adabag
- C50
- caret
- kernlab
- Banjo
- ipred
- FindIt
- GAMBoost
- AdaBoost.MH
- twang
- SuperLearner
- PSMATCH2
- CASdatasets
- ada
- AppliedPredictiveModeling
- corrplot
- Cubist
- COBRA
- klaR
- CoxBoost
- alr4
- Bolstad
- Bolstad2
- DAAGxtras
- DCL
- strucplot
- ibr
- Epi
- GBMCI
- dmt
- See5
- evtree
- quantregForest
- randomForestSRC
- gmb
- FNN
- extraTrees
- gamboostLSS
- speff2trial
- xgboost
- gbev
- PSF
- CBPS
- PresenceAbsence
- infotheo
- biomod2
- EnsembleBase
- Machine learning feature selection methods for landslide susceptibility mapping
- Poisson dependency networks: gradient boosted models for multivariate count data
- mma
- pqR
- MfUSampler
- Renjin
- Riposte
- pre
- verification
- CFC
- SemiParSampleSel
- ppls
- ebal
- CART
- quint
- EEBoost
- TSDL
This page was built for software: gbm