Efficient approximate k-fold and leave-one-out cross-validation for ridge regression
From MaRDI portal
DOI: 10.1002/bimj.201200088 · zbMath: 1441.62437 · Wikidata: Q85954305 · Scholia: Q85954305 · MaRDI QID: Q4917509
Jelle J. Goeman, Rosa J. Meijer
Publication date: 30 April 2013
Published in: Biometrical Journal
Full work available at URL: https://doi.org/10.1002/bimj.201200088
62J07: Ridge regression; shrinkage estimators (Lasso)
62P10: Applications of statistics to biology and medical sciences; meta analysis
Related Items
- Graphical group ridge
- Fast Cross-validation for Multi-penalty High-dimensional Ridge Regression
- A Cross-Validation Statistical Framework for Asymmetric Data Integration
- Inference for non-probability samples under high-dimensional covariate-adjusted superpopulation model
Uses Software
Cites Work
- Fast exact leave-one-out cross-validation of sparse least-squares support vector machines
- Survival prediction using gene expression data: a review and comparison
- A survey of cross-validation procedures for model selection
- Model selection via multifold cross validation
- Shrinkage Tuning Parameter Selection with a Diverging Number of Parameters
- Penalized Partial Likelihood Regression for Right-Censored Data with Bootstrap Selection of the Penalty Parameter
- Penalized Maximum Likelihood Principle for Choosing Ridge Parameter
- Updating the Inverse of a Matrix
- Generalized Cross-Validation as a Method for Choosing a Good Ridge Parameter
- Ridge Estimators in Logistic Regression
- Adjustment of an Inverse Matrix Corresponding to a Change in One Element of a Given Matrix
- An Inverse Matrix Adjustment Arising in Discriminant Analysis
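The cited matrix-update results (Sherman–Morrison-type inverse adjustments) underpin the classical shortcut for fast leave-one-out cross-validation in ridge regression: for a fixed penalty, the held-out residual equals the full-fit residual divided by one minus the leverage. A minimal sketch of that identity, with illustrative synthetic data and an arbitrary penalty value (not code from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, lam = 30, 5, 2.0  # sample size, predictors, ridge penalty (illustrative)
X = rng.standard_normal((n, p))
y = X @ rng.standard_normal(p) + 0.1 * rng.standard_normal(n)

# Ridge is a linear smoother: yhat = H y with H = X (X'X + lam I)^{-1} X'
H = X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T)
resid = y - H @ y

# Leave-one-out residuals via the leverage shortcut: e_i / (1 - H_ii)
loo_shortcut = resid / (1.0 - np.diag(H))

# Brute-force check: refit n times, each time holding out one observation
loo_brute = np.empty(n)
for i in range(n):
    mask = np.arange(n) != i
    beta = np.linalg.solve(X[mask].T @ X[mask] + lam * np.eye(p),
                           X[mask].T @ y[mask])
    loo_brute[i] = y[i] - X[i] @ beta

assert np.allclose(loo_shortcut, loo_brute)
```

The shortcut replaces n refits with a single fit plus the diagonal of the hat matrix, which is what makes exact leave-one-out (and its k-fold approximations) cheap for ridge-type estimators.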