Optimal information criteria minimizing their asymptotic mean square errors
Recommendations
- Asymptotic cumulants of some information criteria
- Improvement to AIC as Estimator of Kullback–Leibler Information for Linear Model Selection
- scientific article; zbMATH DE number 32251
- Information criteria and statistical modeling.
- An iterative approach to variable selection based on the Kullback-Leibler information
Cites work
- scientific article; zbMATH DE number 1239310
- scientific article; zbMATH DE number 1034043
- scientific article; zbMATH DE number 1034048
- scientific article; zbMATH DE number 3444596
- Asymptotic cumulants of the estimator of the canonical parameter in the exponential family
- Asymptotic expansions for the pivots using log-likelihood derivatives with an application in item response theory
- Asymptotic theory for information criteria in model selection -- functional approach
- Bias adjustment minimizing the asymptotic mean square error
- Bootstrapping log likelihood and EIC, an extension of AIC
- Estimating the dimension of a model
- Further analysis of the data by Akaike's information criterion and the finite corrections
- Generalised information criteria in model selection
- Information criteria and statistical modeling.
- Modified AIC and Cp in multivariate linear regression
- On Information and Sufficiency
- Regression and time series model selection in small samples
- Ridge Regression: Biased Estimation for Nonorthogonal Problems
- Some Comments on Cp
Cited in (4)