A family of the information criteria using the phi-divergence for categorical data
DOI: 10.1016/j.csda.2018.03.001; zbMath: 1469.62122; OpenAlex: W2792882170; MaRDI QID: Q1662860
Publication date: 20 August 2018
Published in: Computational Statistics and Data Analysis
Full work available at URL: http://hdl.handle.net/10252/00005700
MSC classification: Computational methods for problems pertaining to statistics (62-08); Statistical aspects of information-theoretic topics (62B10)
Related Items (1)
Cites Work
- Bias correction of the Akaike information criterion in factor analysis
- A class of cross-validatory model selection criteria
- Expected predictive least squares for model selection in covariance structures
- Asymptotic cumulants of the parameter estimators in item response theory
- A differential geometric approach to statistical inference on the basis of contrast functionals
- Minimum \(\phi\)-divergence estimator in logistic regression models
- Minimum \(\phi\)-divergence estimation in constrained latent class models for binary data
- Minimum phi-divergence estimators for loglinear models with linear constraints and multinomial sampling
- Factor analysis and AIC
- Minimum chi-square, not maximum likelihood!
- Goodness-of-fit statistics for discrete multivariate data
- Estimating the dimension of a model
- Model checking in loglinear models using \(\phi\)-divergences and MLEs
- Asymptotic divergence of estimates of discrete distributions
- On the estimation by the minimum distance method
- GLS Discrepancy Based Information Criteria for Selecting Covariance Structure Models
- Regression and time series model selection in small samples
- Further analysis of the data by Akaike's information criterion and the finite corrections
- Modified AIC and Cp in multivariate linear regression
- Minimum power-divergence estimator in three-way contingency tables
- Mean Square Error of Prediction as a Criterion for Selecting Variables
- Some Comments on \(C_P\)
- Transformations Related to the Angular and the Square Root
- On Information and Sufficiency