Extending models via gradient boosting: an application to Mendelian models
From MaRDI portal
Publication:2247455
Abstract: Improving existing widely-adopted prediction models is often a more efficient and robust route to progress than training new models from scratch. Existing models may (a) incorporate complex mechanistic knowledge, (b) leverage proprietary information, and (c) have surmounted barriers to adoption. Compared to model training, model improvement and modification receive little attention. In this paper we propose a general approach to model improvement: we combine gradient boosting with any previously developed model to improve model performance while retaining important existing characteristics. To exemplify, we consider the context of Mendelian models, which estimate the probability of carrying genetic mutations that confer susceptibility to disease by using family pedigrees and health histories of family members. Via simulations we show that integrating gradient boosting with an existing Mendelian model can produce an improved model that outperforms both that model and the model built using gradient boosting alone. We illustrate the approach on genetic testing data from the USC-Stanford Cancer Genetics Hereditary Cancer Panel (HCP) study.
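The combination described in the abstract can be sketched in the standard way such extensions are built: initialize the boosting fit at the existing model's predictions (on the log-odds scale) and then fit small trees to the gradient residuals of the logistic loss, so the boosted terms act as a correction on top of the existing model. The code below is an illustrative sketch under those assumptions, not the authors' implementation; `existing_model_logit` is a hypothetical stand-in for a previously developed model such as a Mendelian risk model.

```python
# Illustrative sketch (not the paper's code): gradient boosting initialized at
# an existing model's log-odds, fitting trees to logistic-loss residuals.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)

# Synthetic data: the outcome depends on the first two features.
X = rng.normal(size=(500, 5))
p_true = 1 / (1 + np.exp(-(X[:, 0] + 0.5 * X[:, 1])))
y = rng.binomial(1, p_true)

def existing_model_logit(X):
    # Hypothetical stand-in for the previously developed model: it uses only
    # the first feature, so boosting has room to improve it.
    return X[:, 0]

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# Initialize the boosting fit at the existing model (on the log-odds scale).
F = existing_model_logit(X)
trees, lr, n_rounds = [], 0.1, 50
for _ in range(n_rounds):
    residual = y - sigmoid(F)  # negative gradient of the logistic loss
    tree = DecisionTreeRegressor(max_depth=2).fit(X, residual)
    trees.append(tree)
    F = F + lr * tree.predict(X)

def predict_proba(X_new):
    # Final prediction = existing model + accumulated boosted correction.
    F_new = existing_model_logit(X_new)
    for tree in trees:
        F_new = F_new + lr * tree.predict(X_new)
    return sigmoid(F_new)
```

Because the boosted trees only add to the existing model's score, setting the learning rate or number of rounds to zero recovers the original model exactly, which is one way an extension like this can retain the base model's existing characteristics.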
Cites work
- Additive logistic regression: a statistical view of boosting (with discussion and a rejoinder by the authors)
- An empirical distribution function for sampling with incomplete information
- Bagging predictors
- BayesMendel: an R environment for Mendelian risk prediction
- Greedy function approximation: a gradient boosting machine
- Maximum likelihood estimates of monotone parameters
- Stochastic gradient boosting