Cross validation model selection criteria for linear regression based on the Kullback-Leibler discrepancy
From MaRDI portal
Publication: 713660
DOI: 10.1016/j.stamet.2005.05.002 · zbMATH Open: 1248.62110 · OpenAlex: W2093725103 · MaRDI QID: Q713660 · FDO: Q713660
Authors: Simon L. Davies, A. A. Neath, Joseph E. Cavanaugh
Publication date: 19 October 2012
Published in: Statistical Methodology
Full work available at URL: https://doi.org/10.1016/j.stamet.2005.05.002
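The paper itself is not reproduced on this page, but the technique named in the title can be sketched. The classical leave-one-out cross-validation criterion for linear regression is the PRESS statistic, computable in closed form from a single fit via the hat matrix. The sketch below uses ordinary squared-error PRESS as an illustration; this is an assumption for demonstration, not the Kullback-Leibler-based criteria the authors derive.

```python
import numpy as np

def press_statistic(X, y):
    """Leave-one-out CV score (PRESS) for the linear model y = X b + e.

    Uses the hat-matrix shortcut: the deleted residual for observation i
    equals e_i / (1 - h_ii), so no refitting is needed.
    """
    H = X @ np.linalg.solve(X.T @ X, X.T)   # hat matrix H = X (X'X)^{-1} X'
    resid = y - H @ y                        # ordinary residuals
    loo_resid = resid / (1.0 - np.diag(H))   # leave-one-out residuals
    return float(np.sum(loo_resid ** 2))

# Compare two candidate designs on simulated data (illustrative only).
rng = np.random.default_rng(0)
n = 100
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=n)

X_true = np.column_stack([np.ones(n), x])                 # correct model
X_over = np.column_stack([np.ones(n), x, x**2, x**3])     # overspecified model

print(press_statistic(X_true, y), press_statistic(X_over, y))
```

Selecting the candidate with the smallest PRESS value is the cross-validatory analogue of minimizing an estimated prediction discrepancy; the paper's contribution is relating such criteria to estimators of the Kullback-Leibler discrepancy.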
Recommendations
- Criteria for Linear Model Selection Based on Kullback's Symmetric Divergence
- Model selection criteria based on Kullback information measures for nonlinear regression
- Estimation in a linear regression model under the Kullback-Leibler loss and its application to model selection
- A class of cross-validatory model selection criteria
- A note on the generalised cross-validation criterion in linear model selection
Cites Work
- Regression and time series model selection in small samples
- Title not available
- A new look at the statistical model identification
- Robust Statistics
- Title not available
- Linear Model Selection by Cross-Validation
- Title not available
- How Biased is the Apparent Error Rate of a Prediction Rule?
- Further analysis of the data by Akaike's information criterion and the finite corrections
- The Relationship between Variable Selection and Data Augmentation and a Method for Prediction
- Estimating the Error Rate of a Prediction Rule: Improvement on Cross-Validation
- Title not available
- Unifying the derivations for the Akaike and corrected Akaike information criteria.
- Approximations for the Psi (Digamma) Function
Cited In (15)
- Corrected versions of cross-validation criteria for selecting multivariate regression and growth curve models
- Estimation in a linear regression model under the Kullback-Leibler loss and its application to model selection
- An alternate version of the conceptual predictive statistic based on a symmetrized discrepancy measure
- An alternate approach to pseudo-likelihood model selection in the generalized linear mixed modeling framework
- Model selection criteria based on cross-validatory concordance statistics
- A note on the generalised cross-validation criterion in linear model selection
- Model selection criteria based on computationally intensive estimators of the expected optimism
- Criteria for Linear Model Selection Based on Kullback's Symmetric Divergence
- The connection between cross-validation and Akaike information criterion in a semiparametric family
- A survey of cross-validation procedures for model selection
- A class of cross-validatory model selection criteria
- Selection criteria based on Monte Carlo simulation and cross validation in mixed models
- Bias correction of cross-validation criterion based on Kullback-Leibler information under a general condition
- Asymptotic biases of information and cross-validation criteria under canonical parametrization
- Bootstrap-based model selection criteria for beta regressions