Extending models via gradient boosting: an application to Mendelian models

From MaRDI portal
Publication:2247455

DOI: 10.1214/21-AOAS1482
zbMATH Open: 1478.62329
arXiv: 2105.06559
OpenAlex: W3203065292
MaRDI QID: Q2247455
FDO: Q2247455


Authors: Theodore Huang, Gregory Idos, Christine Hong, Stephen B. Gruber, Giovanni Parmigiani, Danielle Braun


Publication date: 17 November 2021

Published in: The Annals of Applied Statistics

Abstract: Improving existing widely adopted prediction models is often a more efficient and robust path to progress than training new models from scratch. Existing models may (a) incorporate complex mechanistic knowledge, (b) leverage proprietary information, and (c) have surmounted barriers to adoption. Compared to model training, model improvement and modification receive little attention. In this paper we propose a general approach to model improvement: we combine gradient boosting with any previously developed model to improve model performance while retaining important existing characteristics. To exemplify, we consider the context of Mendelian models, which estimate the probability of carrying genetic mutations that confer susceptibility to disease by using family pedigrees and health histories of family members. Via simulations we show that integrating gradient boosting with an existing Mendelian model can produce an improved model that outperforms both that model and a model built using gradient boosting alone. We illustrate the approach on genetic testing data from the USC-Stanford Cancer Genetics Hereditary Cancer Panel (HCP) study.


Full work available at URL: https://arxiv.org/abs/2105.06559
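One generic way to realize the idea described in the abstract, sketched below under assumptions not taken from the paper itself, is to let gradient boosting start from the predictions of an existing fitted model rather than from a constant baseline. scikit-learn's `GradientBoostingClassifier` supports this through its `init` parameter, which accepts any estimator exposing `fit` and `predict_proba`; wrapping the pre-fitted model with a no-op `fit` makes the boosting stages learn corrections on top of it. The `ExistingModelInit` wrapper, the logistic-regression stand-in for a Mendelian carrier-probability model, and the synthetic data are all illustrative choices, not the authors' method.

```python
import numpy as np
from sklearn.base import BaseEstimator
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split


class ExistingModelInit(BaseEstimator):
    """Wrap an already-fitted model so boosting starts from its
    predictions instead of a constant prior."""

    def __init__(self, model):
        self.model = model

    def fit(self, X, y, sample_weight=None):
        # The wrapped model is already fitted; nothing to do here.
        return self

    def predict_proba(self, X):
        return self.model.predict_proba(X)


# Synthetic binary outcome standing in for carrier status (hypothetical).
X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# "Existing model": a simple logistic regression as a placeholder
# for a previously developed, domain-informed model.
existing = LogisticRegression().fit(X_tr, y_tr)

# Boosting stages then fit corrections on top of the existing model's
# predicted log-odds.
gb = GradientBoostingClassifier(
    init=ExistingModelInit(existing), n_estimators=100, random_state=0
)
gb.fit(X_tr, y_tr)
print(gb.score(X_te, y_te))
```

Because `fit` on the wrapper is a no-op, the existing model's learned behavior is preserved exactly as the boosting baseline; only the additive tree corrections are learned from the new data.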




Recommendations




Cites Work


Cited In (1)

Uses Software





This page was built for publication: Extending models via gradient boosting: an application to Mendelian models
