Model selection criteria based on Kullback information measures for nonlinear regression
From MaRDI portal
Publication: 2386146
DOI: 10.1016/j.jspi.2004.05.002
zbMath: 1140.62331
OpenAlex: W2002890723
MaRDI QID: Q2386146
Hyun-Joo Kim, Joseph E. Cavanaugh
Publication date: 22 August 2005
Published in: Journal of Statistical Planning and Inference
Full work available at URL: https://doi.org/10.1016/j.jspi.2004.05.002
Keywords: Akaike information criterion; AIC; Kullback-Leibler information; I-divergence; J-divergence; nonlinear regression
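The criteria named in the keywords can be illustrated with a minimal sketch. Assuming Gaussian errors, AIC is -2 log L + 2k, while KIC, the large-sample criterion based on Kullback's symmetric (J-) divergence from the cited Cavanaugh (1999) paper, is -2 log L + 3k. The exponential model, grid-search fit, and all parameter values below are illustrative assumptions, not taken from the paper itself.

```python
import numpy as np

# Illustrative nonlinear regression: y = exp(b*x) + noise (assumed example model).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0, 50)
y = np.exp(0.7 * x) + rng.normal(scale=0.1, size=x.size)

def neg2_loglik(b):
    """-2 * maximized Gaussian log-likelihood, with sigma^2 profiled out (MLE)."""
    resid = y - np.exp(b * x)
    n = x.size
    sigma2 = np.mean(resid ** 2)  # MLE of the error variance
    return n * (np.log(2 * np.pi * sigma2) + 1.0)

# Crude grid search over the single nonlinear parameter b (for self-containment;
# in practice a nonlinear least-squares routine would be used).
grid = np.linspace(0.0, 2.0, 2001)
b_hat = grid[np.argmin([neg2_loglik(b) for b in grid])]

k = 2  # estimated parameters: b and sigma^2
aic = neg2_loglik(b_hat) + 2 * k  # directed (I-divergence) penalty
kic = neg2_loglik(b_hat) + 3 * k  # heavier penalty from the symmetric J-divergence
print(f"b_hat={b_hat:.3f}  AIC={aic:.2f}  KIC={kic:.2f}")
```

The only difference between the two criteria here is the per-parameter penalty (2 vs. 3), so KIC favors more parsimonious models than AIC in small samples.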
Related Items (4)
Some selection criteria for nested binary choice models: a comparative study ⋮ An alternate version of the conceptual predictive statistic based on a symmetrized discrepancy measure ⋮ A cluster tree based model selection approach for logistic regression classifier ⋮ Model selection and mixed-effects modeling of HIV infection dynamics
Uses Software
Cites Work
- Unifying the derivations for the Akaike and corrected Akaike information criteria.
- A large-sample model selection criterion based on Kullback's symmetric divergence
- Regression and time series model selection in small samples
- Further analysis of the data by Akaike's information criterion and the finite corrections
- Criteria for Linear Model Selection Based on Kullback's Symmetric Divergence
- On Information and Sufficiency
- A new look at the statistical model identification