Pages that link to "Item:Q5295349"
From MaRDI portal
The following pages link to Generalized Additive Modeling with Implicit Variable Selection by Likelihood‐Based Boosting (Q5295349):
Displaying 48 items.
- Boosted multivariate trees for longitudinal data (Q113262)
- Improved nearest neighbor classifiers by weighting and selection of predictors (Q340856)
- Boosting algorithms: regularization, prediction and model fitting (Q449780)
- Additive model selection (Q513754)
- A boosting method for maximization of the area under the ROC curve (Q645529)
- On the choice and influence of the number of boosting steps for high-dimensional linear Cox-models (Q722722)
- Boosting in Cox regression: a comparison between the likelihood-based and the model-based approaches with focus on the R-packages \textit{CoxBoost} and \textit{mboost} (Q736636)
- Nonparametric estimation of the link function including variable selection (Q746233)
- Practical variable selection for generalized additive models (Q901636)
- Boosting additive models using component-wise P-splines (Q961113)
- Boosting nonlinear additive autoregressive time series (Q961660)
- Knot selection by boosting techniques (Q1020124)
- Boosting ridge regression (Q1020707)
- Selection of components and degrees of smoothing via Lasso in high dimensional nonparametric additive models (Q1023939)
- Sequential double cross-validation for assessment of added predictive ability in high-dimensional omic applications (Q1621008)
- Penalized likelihood and Bayesian function selection in regression models (Q1621251)
- Variable selection in general multinomial logit models (Q1623760)
- Probing for sparse and fast variable selection with model-based boosting (Q1664500)
- An update on statistical boosting in biomedicine (Q1664502)
- Improving the prediction performance of the Lasso by subtracting the additive structural noises (Q1729359)
- Semiparametric regression during 2003--2007 (Q1952023)
- A likelihood-based boosting algorithm for factor analysis models with binary data (Q2076167)
- Inference and computation with generalized additive models and their extensions (Q2195738)
- A sequential approach to feature selection in high-dimensional additive models (Q2242862)
- Ridge estimation for multinomial logit models with symmetric side constraints (Q2255916)
- Variable selection and model choice in structured survival models (Q2255920)
- Multinomial logit models with implicit variable selection (Q2256779)
- A penalty approach to differential item functioning in Rasch models (Q2348182)
- SEMIPARAMETRIC REGRESSION AND GRAPHICAL MODELS (Q2802725)
- Variable Selection and Model Choice in Geoadditive Regression Models (Q3637018)
- Detection of differential item functioning in Rasch models by boosting techniques (Q4614710)
- Subject-specific Bradley–Terry–Luce models with implicit variable selection (Q4971434)
- Generalized additive models with unknown link function including variable selection (Q5138222)
- Boosting for statistical modelling-A non-technical introduction (Q5142213)
- Regularized proportional odds models (Q5220716)
- An overview of techniques for linking high‐dimensional molecular data to time‐to‐event endpoints by risk prediction models (Q5391149)
- Stochastic Approximation Boosting for Incomplete Data Problems (Q5850964)
- Response versus gradient boosting trees, GLMs and neural networks under Tweedie loss and log-link (Q5872567)
- Feature selection in ultrahigh-dimensional additive models with heterogeneous frequency component functions (Q6101702)
- Prediction of sports injuries in football: a recurrent time-to-event approach using regularized Cox models (Q6107409)
- Introducing Lasso-type penalisation to generalised joint regression modelling for count data (Q6107410)
- De-noising boosting methods for variable selection and estimation subject to error-prone variables (Q6171767)
- Feature selection algorithms in generalized additive models under concurvity (Q6567405)
- Bayesian learners in gradient boosting for linear mixed models (Q6590279)
- Boosting multivariate structured additive distributional regression models (Q6617538)
- Gradient boosting for linear mixed models (Q6636028)
- Super learner for survival data prediction (Q6636051)
- Significance tests for boosted location and scale models with linear base-learners (Q6637197)