Generalized Additive Modeling with Implicit Variable Selection by Likelihood‐Based Boosting
Publication: 5295349
DOI: 10.1111/j.1541-0420.2006.00578.x
zbMath: 1116.62075
OpenAlex: W2025266808
Wikidata: Q51909164
Scholia: Q51909164
MaRDI QID: Q5295349
Publication date: 27 July 2007
Published in: Biometrics
Full work available at URL: https://doi.org/10.1111/j.1541-0420.2006.00578.x
MSC classification:
- Nonparametric regression and quantile regression (62G08)
- Generalized linear models (logistic models) (62J12)
Related Items (43)
- Sequential double cross-validation for assessment of added predictive ability in high-dimensional omic applications
- Penalized likelihood and Bayesian function selection in regression models
- Variable selection in general multinomial logit models
- De-noising boosting methods for variable selection and estimation subject to error-prone variables
- Improved nearest neighbor classifiers by weighting and selection of predictors
- Probing for sparse and fast variable selection with model-based boosting
- An update on statistical boosting in biomedicine
- Feature selection in ultrahigh-dimensional additive models with heterogeneous frequency component functions
- Prediction of sports injuries in football: a recurrent time-to-event approach using regularized Cox models
- Introducing Lasso-type penalisation to generalised joint regression modelling for count data
- Inference and computation with generalized additive models and their extensions
- Response versus gradient boosting trees, GLMs and neural networks under Tweedie loss and log-link
- Large Scale Prediction with Decision Trees
- Practical variable selection for generalized additive models
- Semiparametric regression during 2003--2007
- Boosting algorithms: regularization, prediction and model fitting
- A boosting method for maximization of the area under the ROC curve
- Generalized additive models with unknown link function including variable selection
- Improving the prediction performance of the Lasso by subtracting the additive structural noises
- Boosting for statistical modelling: a non-technical introduction
- An overview of techniques for linking high‐dimensional molecular data to time‐to‐event endpoints by risk prediction models
- A sequential approach to feature selection in high-dimensional additive models
- Additive model selection
- Boosting additive models using component-wise P-splines
- Boosting nonlinear additive autoregressive time series
- Ridge estimation for multinomial logit models with symmetric side constraints
- Variable selection and model choice in structured survival models
- Multinomial logit models with implicit variable selection
- Boosted multivariate trees for longitudinal data
- On the choice and influence of the number of boosting steps for high-dimensional linear Cox-models
- Semiparametric regression and graphical models
- Boosting in Cox regression: a comparison between the likelihood-based and the model-based approaches with focus on the R-packages \textit{CoxBoost} and \textit{mboost}
- Nonparametric estimation of the link function including variable selection
- Subject-specific Bradley–Terry–Luce models with implicit variable selection
- Regularized proportional odds models
- Knot selection by boosting techniques
- Boosting ridge regression
- Variable Selection and Model Choice in Geoadditive Regression Models
- Selection of components and degrees of smoothing via Lasso in high dimensional nonparametric additive models
- A likelihood-based boosting algorithm for factor analysis models with binary data
- Detection of differential item functioning in Rasch models by boosting techniques
- Stochastic Approximation Boosting for Incomplete Data Problems
- A penalty approach to differential item functioning in Rasch models
Uses Software
Cites Work
- gss
- Direct generalized additive modeling with penalized likelihood
- A comparison of regression spline smoothing procedures
- Flexible smoothing with \(B\)-splines and penalties. With comments and a rejoinder by the authors
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Minimizing GCV/GML Scores with Multiple Smoothing Parameters via the Newton Method
- Boosting With the L2 Loss
- A kernel method of estimating structured nonparametric regression based on marginal integration
- Stable and Efficient Multiple Smoothing Parameter Estimation for Generalized Additive Models
- The elements of statistical learning. Data mining, inference, and prediction
- Smoothing spline ANOVA models