Inadmissibility of the corrected Akaike information criterion
Publication: 6201858
DOI: 10.3150/23-BEJ1638
arXiv: 2211.09326
OpenAlex: W4391458284
MaRDI QID: Q6201858
FDO: Q6201858
Authors: T. Matsuda
Publication date: 26 March 2024
Published in: Bernoulli
Abstract: For the multivariate linear regression model with unknown covariance, the corrected Akaike information criterion is the minimum-variance unbiased estimator of the expected Kullback-Leibler discrepancy. In this study, working within the loss estimation framework, we show that it is inadmissible as an estimator of the Kullback-Leibler discrepancy itself, rather than of its expectation. We provide improved estimators of the Kullback-Leibler discrepancy that perform well in reduced-rank situations and examine their performance numerically.
Full work available at URL: https://arxiv.org/abs/2211.09326
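As a brief sketch of the setting described in the abstract (the notation and constants below follow the standard form of the corrected AIC for multivariate regression and are assumptions, not taken from this entry): for the model \(Y = XB + E\), with \(Y\) an \(n \times p\) response matrix, \(X\) an \(n \times k\) design matrix of full column rank, and the rows of \(E\) i.i.d. \(N_p(0, \Sigma)\), the corrected AIC is
\[
\mathrm{AIC}_c = n \log \lvert \hat{\Sigma} \rvert + \frac{n(n+k)p}{n-k-p-1},
\qquad
\hat{\Sigma} = \frac{1}{n}(Y - X\hat{B})^{\top}(Y - X\hat{B}),
\]
which is unbiased, up to an additive constant not depending on the fitted model, for the expected Kullback-Leibler discrepancy \(\mathrm{E}[\Delta(\hat{B}, \hat{\Sigma})]\), where
\[
\Delta(\hat{B}, \hat{\Sigma}) = -2\, \mathrm{E}_{\tilde{Y}}\bigl[\log f(\tilde{Y}; \hat{B}, \hat{\Sigma})\bigr]
\]
and \(\tilde{Y}\) denotes an independent copy of \(Y\). The abstract's claim concerns \(\mathrm{AIC}_c\) viewed as an estimator of the random quantity \(\Delta(\hat{B}, \hat{\Sigma})\) itself, rather than of its expectation.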
Keywords: Akaike information criterion; admissibility; loss estimation; Kullback-Leibler discrepancy; corrected Akaike information criterion
Cites Work
- Regression and time series model selection in small samples
- Multivariate reduced-rank regression
- Model Selection and Multimodel Inference
- Further analysis of the data by Akaike's information criterion and the finite corrections
- Bayesian shrinkage prediction for the regression problem
- Information criteria and statistical modeling
- Multivariate empirical Bayes and estimation of covariance matrices
- On Bayes and unbiased estimators of loss
- Goodness of prediction fit
- Model Selection for Multivariate Regression in Small Samples
- Least squares model averaging by Mallows criterion
- Least Squares Model Averaging
- Empirical Bayes on vector observations: An extension of Stein's method
- Consistency of high-dimensional AIC-type and \(C_p\)-type criteria in multivariate linear regression
- Modified AIC and Cp in multivariate linear regression
- Pitman closeness properties of Bayes shrinkage procedures in estimation and prediction
- Unifying the derivations for the Akaike and corrected Akaike information criteria
- A consistency property of the AIC for multivariate linear models when the dimension and the sample size are large
- Improved estimation of the expected Kullback-Leibler discrepancy in case of misspecification
- On unbiased and improved loss estimation for the mean of a multivariate normal distribution with unknown variance
- Ancillary Statistics and Estimation of the Loss in Estimation Problems
- Estimation of normal means: Frequentist estimation of loss
- Estimation under matrix quadratic loss and matrix superharmonicity
- On improved loss estimation for shrinkage estimators
- Information criteria for the predictive evaluation of Bayesian models
- Improved loss estimation for a normal mean matrix
- Second-order Stein: SURE for SURE and other applications in high-dimensional inference
- Improved loss estimation for the lasso: a variable selection tool
- Akaike's information criterion, \(C_p\) and estimators of loss for elliptically symmetric distributions
- Shrinkage estimation
- From Fixed-X to Random-X Regression: Bias-Variance Decompositions, Covariance Penalties, and Prediction Error Estimation