Improved minimax predictive densities under Kullback-Leibler loss

From MaRDI portal

DOI: 10.1214/009053606000000155
zbMATH Open: 1091.62003
arXiv: math/0605432
OpenAlex: W2056860084
MaRDI QID: Q2493546


Authors: Edward I. George, Feng Liang, Xinyi Xu


Publication date: 21 June 2006

Published in: The Annals of Statistics

Abstract: Let $X|\mu \sim N_p(\mu, v_x I)$ and $Y|\mu \sim N_p(\mu, v_y I)$ be independent $p$-dimensional multivariate normal vectors with common unknown mean $\mu$. Based on only observing $X = x$, we consider the problem of obtaining a predictive density $\hat{p}(y|x)$ for $Y$ that is close to $p(y|\mu)$ as measured by expected Kullback--Leibler loss. A natural procedure for this problem is the (formal) Bayes predictive density $\hat{p}_{\mathrm{U}}(y|x)$ under the uniform prior $\pi_{\mathrm{U}}(\mu) \equiv 1$, which is best invariant and minimax. We show that any Bayes predictive density will be minimax if it is obtained by a prior yielding a marginal that is superharmonic or whose square root is superharmonic. This yields wide classes of minimax procedures that dominate $\hat{p}_{\mathrm{U}}(y|x)$, including Bayes predictive densities under superharmonic priors. Fundamental similarities and differences with the parallel theory of estimating a multivariate normal mean under quadratic loss are described.
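As a concrete illustration of the baseline procedure in the abstract: under the uniform prior $\pi_{\mathrm{U}}(\mu) \equiv 1$, integrating $N_p(y; \mu, v_y I)$ against the posterior of $\mu$ given $x$ yields the standard closed form $\hat{p}_{\mathrm{U}}(y|x) = N_p(y; x, (v_x + v_y)I)$. The sketch below (an illustrative helper, not code from the paper) evaluates this density and checks it against a Monte Carlo average over posterior draws of $\mu$:

```python
import numpy as np

def predictive_density_uniform(y, x, vx, vy):
    """Uniform-prior Bayes predictive density p_U(y|x) = N_p(y; x, (vx+vy) I).

    Assumes X|mu ~ N_p(mu, vx I), Y|mu ~ N_p(mu, vy I), observed X = x.
    This closed form is standard; the helper name is ours, not the paper's.
    """
    p = x.size
    s = vx + vy  # predictive variance adds the two noise levels
    diff = y - x
    return np.exp(-diff @ diff / (2.0 * s)) / (2.0 * np.pi * s) ** (p / 2.0)

# Monte Carlo sanity check: averaging N_p(y; mu, vy I) over the posterior
# mu | x ~ N_p(x, vx I) should recover the closed form above.
rng = np.random.default_rng(0)
p, vx, vy = 3, 1.0, 2.0
x, y = np.zeros(p), np.ones(p)
mus = rng.normal(loc=x, scale=np.sqrt(vx), size=(200_000, p))
d = y - mus
mc = np.mean(np.exp(-np.sum(d * d, axis=1) / (2.0 * vy))
             / (2.0 * np.pi * vy) ** (p / 2.0))
closed = predictive_density_uniform(y, x, vx, vy)
```

The paper's main result concerns alternatives to this baseline: priors whose marginals (or their square roots) are superharmonic give Bayes predictive densities that remain minimax and can dominate $\hat{p}_{\mathrm{U}}$ in Kullback--Leibler risk.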


Full work available at URL: https://arxiv.org/abs/math/0605432










Cited In (59)





