Cross validation model selection criteria for linear regression based on the Kullback-Leibler discrepancy
From MaRDI portal
Publication: 713660
DOI: 10.1016/j.stamet.2005.05.002
zbMath: 1248.62110
OpenAlex: W2093725103
MaRDI QID: Q713660
Joseph E. Cavanaugh, Simon L. Davies, Andrew A. Neath
Publication date: 19 October 2012
Published in: Statistical Methodology
Full work available at URL: https://doi.org/10.1016/j.stamet.2005.05.002
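The paper concerns selecting among candidate linear regression models using cross validation as an estimator of Kullback-Leibler discrepancy. As a generic illustration (not the paper's specific criterion), the sketch below computes the leave-one-out cross-validation error (the PRESS statistic) for two candidate ordinary-least-squares models, using the standard hat-matrix shortcut that avoids refitting the model n times; the simulated data and candidate designs are assumptions for the example.

```python
import numpy as np

def press(X, y):
    """Leave-one-out CV error (PRESS) for OLS.

    Uses the identity e_(i) = e_i / (1 - h_ii), where e_i is the
    ordinary residual and h_ii the i-th hat-matrix diagonal, so no
    refitting is needed.
    """
    H = X @ np.linalg.inv(X.T @ X) @ X.T      # hat (projection) matrix
    resid = y - H @ y                         # ordinary residuals
    loo = resid / (1.0 - np.diag(H))          # deleted (LOO) residuals
    return float(np.sum(loo ** 2))

# Simulated data: the true mean function is linear in x.
rng = np.random.default_rng(0)
n = 60
x = rng.uniform(-2.0, 2.0, n)
y = 1.0 + 2.0 * x + rng.normal(0.0, 0.5, n)

# Two candidate design matrices: a straight line and a cubic.
X_lin = np.column_stack([np.ones(n), x])
X_cub = np.column_stack([np.ones(n), x, x**2, x**3])

# Lower PRESS indicates better estimated predictive fit.
print("PRESS linear:", press(X_lin, y))
print("PRESS cubic :", press(X_cub, y))
```

Because each hat-matrix diagonal satisfies 0 < h_ii < 1 for a full-rank design, every deleted residual is at least as large in magnitude as the corresponding ordinary residual, so PRESS always bounds the in-sample residual sum of squares from above.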
Related Items
- The connection between cross-validation and Akaike information criterion in a semiparametric family
- Model selection criteria based on cross-validatory concordance statistics
- Asymptotic biases of information and cross-validation criteria under canonical parametrization
- Bootstrap-based model selection criteria for beta regressions
- A survey of cross-validation procedures for model selection
- An alternate approach to pseudo-likelihood model selection in the generalized linear mixed modeling framework
Cites Work
- Unifying the derivations for the Akaike and corrected Akaike information criteria
- Estimating the Error Rate of a Prediction Rule: Improvement on Cross-Validation
- How Biased is the Apparent Error Rate of a Prediction Rule?
- Regression and time series model selection in small samples
- Further analysis of the data by Akaike's information criterion and the finite corrections
- The Relationship between Variable Selection and Data Agumentation and a Method for Prediction
- Linear Model Selection by Cross-Validation
- Approximations for the Psi (Digamma) Function
- Robust Statistics
- A new look at the statistical model identification