Cross validation model selection criteria for linear regression based on the Kullback-Leibler discrepancy
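The publication cataloged here studies cross-validation criteria for choosing among candidate linear regression models, with the criterion motivated by the Kullback-Leibler discrepancy. As a related, well-known illustration (not the paper's own criterion), the sketch below computes the leave-one-out cross-validation prediction error (PRESS) for ordinary least squares using the standard hat-matrix shortcut, so no model refits are needed; the function name `loocv_press` and the synthetic data are assumptions for this example.

```python
import numpy as np

def loocv_press(X, y):
    """Leave-one-out CV sum of squared prediction errors (PRESS) for OLS.

    Uses the hat-matrix identity e_(i) = e_i / (1 - h_ii), where e_i is the
    ordinary residual and h_ii the leverage, so a single fit suffices.
    Assumes X has full column rank.
    """
    H = X @ np.linalg.pinv(X)        # hat matrix H = X (X'X)^{-1} X'
    resid = y - H @ y                # ordinary residuals e_i
    h = np.diag(H)                   # leverages h_ii
    return float(np.sum((resid / (1.0 - h)) ** 2))

# Compare a correctly specified model against an overfitted one
# on synthetic data (illustrative setup, not from the publication).
rng = np.random.default_rng(0)
n = 50
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=n)

X_lin = np.column_stack([np.ones(n), x])                  # true form
X_cub = np.column_stack([np.ones(n), x, x**2, x**3])      # extra terms
print(loocv_press(X_lin, y), loocv_press(X_cub, y))
```

A smaller PRESS value indicates better out-of-sample prediction; criteria of the kind surveyed on this page refine such cross-validatory scores so that they estimate the Kullback-Leibler discrepancy between the candidate model and the true generating model.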
From MaRDI portal
Recommendations
- Criteria for Linear Model Selection Based on Kullback's Symmetric Divergence
- Model selection criteria based on Kullback information measures for nonlinear regression
- Estimation in a linear regression model under the Kullback-Leibler loss and its application to model selection
- A class of cross-validatory model selection criteria
- A note on the generalised cross-validation criterion in linear model selection
Cites work
- scientific article, zbMATH DE number 4088698 (no title available)
- scientific article, zbMATH DE number 486467 (no title available)
- scientific article, zbMATH DE number 3444596 (no title available)
- scientific article, zbMATH DE number 3241743 (no title available)
- A new look at the statistical model identification
- Approximations for the Psi (Digamma) Function
- Estimating the Error Rate of a Prediction Rule: Improvement on Cross-Validation
- Further analysis of the data by Akaike's information criterion and the finite corrections
- How Biased is the Apparent Error Rate of a Prediction Rule?
- Linear Model Selection by Cross-Validation
- Regression and time series model selection in small samples
- Robust Statistics
- The Relationship between Variable Selection and Data Augmentation and a Method for Prediction
- Unifying the derivations for the Akaike and corrected Akaike information criteria.
Cited in (16)
- A note on the generalised cross-validation criterion in linear model selection
- Selection criteria based on Monte Carlo simulation and cross validation in mixed models
- Estimation in a linear regression model under the Kullback-Leibler loss and its application to model selection
- A survey of cross-validation procedures for model selection
- Corrected versions of cross-validation criteria for selecting multivariate regression and growth curve models
- A class of cross-validatory model selection criteria
- Asymptotic biases of information and cross-validation criteria under canonical parametrization
- An alternate approach to pseudo-likelihood model selection in the generalized linear mixed modeling framework
- Bootstrap-based model selection criteria for beta regressions
- The connection between cross-validation and Akaike information criterion in a semiparametric family
- Estimating the Kullback–Leibler risk based on multifold cross-validation
- Model selection criteria based on computationally intensive estimators of the expected optimism
- Bias correction of cross-validation criterion based on Kullback-Leibler information under a general condition
- Model selection criteria based on cross-validatory concordance statistics
- Criteria for Linear Model Selection Based on Kullback's Symmetric Divergence
- An alternate version of the conceptual predictive statistic based on a symmetrized discrepancy measure
This page was built for publication: Cross validation model selection criteria for linear regression based on the Kullback-Leibler discrepancy