Inadmissibility of the corrected Akaike information criterion
From MaRDI portal
Publication:6201858
Abstract: For the multivariate linear regression model with unknown covariance, the corrected Akaike information criterion (AICc) is the minimum variance unbiased estimator of the expected Kullback--Leibler discrepancy. In this study, working within the loss estimation framework, we show that it is inadmissible as an estimator of the Kullback--Leibler discrepancy itself, rather than of its expectation. We provide improved estimators of the Kullback--Leibler discrepancy that perform well in reduced-rank situations and examine their behavior numerically.
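To fix ideas, the small-sample correction that distinguishes AICc from AIC can be sketched for the univariate Gaussian linear regression model; this is only an illustration, since the paper's setting is the multivariate model with unknown covariance. The function name `aic_aicc` and the data layout are assumptions for the example; the AICc formula used is the standard Hurvich--Tsai correction AICc = AIC + 2k(k+1)/(n-k-1).

```python
import numpy as np

def aic_aicc(y, X):
    """AIC and corrected AIC (AICc) for a Gaussian linear regression.

    Illustrative univariate sketch only; the paper's setting is the
    multivariate linear model with unknown covariance matrix.
    """
    n, p = X.shape
    # Least-squares fit and maximum-likelihood variance estimate
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / n
    k = p + 1  # regression coefficients plus the error variance
    # Gaussian log-likelihood at the MLE
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    aic = -2 * loglik + 2 * k
    # Small-sample (Hurvich--Tsai) correction; requires n > k + 1
    aicc = aic + 2 * k * (k + 1) / (n - k - 1)
    return aic, aicc
```

The correction term vanishes as n grows with k fixed, so AICc and AIC agree asymptotically; in small samples AICc penalizes model size more heavily.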
Cites work
- scientific article (untitled); zbMATH DE number 4056770
- scientific article (untitled); zbMATH DE number 18864
- scientific article (untitled); zbMATH DE number 3557007
- scientific article (untitled); zbMATH DE number 1964693
- scientific article (untitled); zbMATH DE number 3444596
- A consistency property of the AIC for multivariate linear models when the dimension and the sample size are large
- Akaike's information criterion, \(C_p\) and estimators of loss for elliptically symmetric distributions
- Ancillary Statistics and Estimation of the Loss in Estimation Problems
- Bayesian shrinkage prediction for the regression problem
- Consistency of high-dimensional AIC-type and \(C_p\)-type criteria in multivariate linear regression
- Empirical Bayes on vector observations: An extension of Stein's method
- Estimation of normal means: Frequentist estimation of loss
- Estimation under matrix quadratic loss and matrix superharmonicity
- From Fixed-X to Random-X Regression: Bias-Variance Decompositions, Covariance Penalties, and Prediction Error Estimation
- Further analysis of the data by Akaike's information criterion and the finite corrections
- Goodness of prediction fit
- Improved estimation of the expected Kullback–Leibler discrepancy in case of misspecification
- Improved loss estimation for a normal mean matrix
- Improved loss estimation for the lasso: a variable selection tool
- Information criteria and statistical modeling
- Information criteria for the predictive evaluation of Bayesian models
- Least Squares Model Averaging
- Least squares model averaging by Mallows criterion
- Model Selection and Multimodel Inference
- Model Selection for Multivariate Regression in Small Samples
- Modified AIC and Cp in multivariate linear regression
- Multivariate empirical Bayes and estimation of covariance matrices
- Multivariate reduced-rank regression
- On Bayes and unbiased estimators of loss
- On improved loss estimation for shrinkage estimators
- On unbiased and improved loss estimation for the mean of a multivariate normal distribution with unknown variance
- Pitman closeness properties of Bayes shrinkage procedures in estimation and prediction
- Regression and time series model selection in small samples
- Second-order Stein: SURE for SURE and other applications in high-dimensional inference
- Shrinkage estimation
- Unifying the derivations for the Akaike and corrected Akaike information criteria