An information criterion for auxiliary variable selection in incomplete data analysis

Publication: Q6314404

arXiv: 1902.07954 · MaRDI QID: Q6314404

Hidetoshi Shimodaira, Shinpei Imori

Publication date: 21 February 2019

Abstract: Statistical inference is considered for variables of interest, called primary variables, when auxiliary variables are observed along with the primary variables. We consider the setting of incomplete data analysis, where some primary variables are not observed. Utilizing a parametric model of the joint distribution of the primary and auxiliary variables, it is possible to improve the estimation of the parametric model for the primary variables when the auxiliary variables are closely related to the primary variables. However, the estimation accuracy deteriorates when the auxiliary variables are irrelevant to the primary variables. To select useful auxiliary variables, we formulate the problem as model selection and propose an information criterion for predicting primary variables by leveraging auxiliary variables. The proposed information criterion is an asymptotically unbiased estimator of the Kullback-Leibler divergence for complete data of the primary variables under some reasonable conditions. We also clarify an asymptotic equivalence between the proposed information criterion and a variant of leave-one-out cross-validation. The performance of our method is demonstrated via a simulation study and a real data example.
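As a rough illustration of the kind of criterion described in the abstract (not the paper's exact formula), an AIC-type information criterion serves as an asymptotically unbiased estimator of the expected Kullback-Leibler divergence between the true distribution and the fitted predictive distribution. The symbols below are illustrative placeholders: $y_i$ denotes the primary variables, $\hat{\theta}$ an estimator fitted using both primary and auxiliary variables, and $b$ a bias-correction term.

% A minimal AIC-style sketch, assuming y_i are the primary variables and
% \hat{\theta} is estimated from the joint model of primary and auxiliary
% variables under incomplete data; b denotes the bias correction that makes
% the criterion asymptotically unbiased for the expected KL divergence.
\[
  \mathrm{IC} \;=\; -2 \sum_{i=1}^{n} \log f\!\left(y_i \mid \hat{\theta}\right) \;+\; 2\,b .
\]
% A smaller IC value suggests that the chosen set of auxiliary variables is
% expected to improve prediction of the primary variables; the paper derives
% the appropriate correction term for the incomplete-data setting.

In this sketch, candidate sets of auxiliary variables would be compared by their IC values, with the smallest value indicating the preferred set; the precise form of the bias correction is what distinguishes the proposed criterion from a standard AIC.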






