An iterative approach to variable selection based on the Kullback-Leibler information
From MaRDI portal
Publication: 4241673
DOI: 10.1080/03610929908832342
zbMATH Open: 1052.62502
OpenAlex: W2079399792
MaRDI QID: Q4241673
FDO: Q4241673
Authors: Anthony W. Hughes, Maxwell L. King
Publication date: 14 June 1999
Published in: Communications in Statistics: Theory and Methods
Full work available at URL: https://doi.org/10.1080/03610929908832342
Recommendations
- scientific article; zbMATH DE number 2020784
- The model selection criterion AICu.
- An improved Akaike information criterion for state-space model selection
- Improvement to AIC as Estimator of Kullback–Leibler Information for Linear Model Selection
- Model selection criteria based on Kullback information measures for nonlinear regression
Cited In (10)
- A New Approach of Information Discrepancy to Analysis of Questionnaire Data
- Evaluation of the Kullback‐Leibler Discrepancy for Model Selection in Open Population Capture‐Recapture Models
- Estimation in a linear regression model under the Kullback-Leibler loss and its application to model selection
- Model selection using AIC in the presence of one-sided information
- A multistage algorithm for best-subset model selection based on the Kullback-Leibler discrepancy
- A note on overfitting properties of KIC and KIC\(_{c}\)
- Improvement to AIC as Estimator of Kullback–Leibler Information for Linear Model Selection
- Optimal information criteria minimizing their asymptotic mean square errors
- Title not available
- Improved estimation of the expected Kullback–Leibler discrepancy in case of misspecification