Importance sampling schemes for evidence approximation in mixture models

From MaRDI portal
Publication:516488

DOI: 10.1214/15-BA970
zbMATH Open: 1357.62116
arXiv: 1311.6000
MaRDI QID: Q516488
FDO: Q516488

Sumit K. Garg, Yong-Cai Geng

Publication date: 14 March 2017

Published in: Bayesian Analysis

Abstract: The marginal likelihood is a central tool for drawing Bayesian inference about the number of components in mixture models; since its exact form is unavailable, it must be approximated. The approximation can be biased when a simulated Markov chain (e.g., a Gibbs sequence) incompletely explores the collection of posterior modes, a phenomenon known as lack of label switching: the chain must visit all possible label permutations in order to converge and hence overcome the bias. In an importance sampling approach, imposing label switching on the importance function makes the computational cost grow exponentially with the number of components. In this paper, two importance sampling schemes are proposed through choices of the importance function: an MLE-based proposal and a Rao-Blackwellised importance function. The second scheme is called dual importance sampling. We demonstrate that dual importance sampling is a valid estimator of the evidence and, moreover, show that it increases the statistical efficiency of the estimates. To reduce the induced computational demand, the original importance function is approximated; a suitable approximation produces an estimate of the same precision at a reduced computational workload.
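The core idea described in the abstract can be sketched numerically. The following is a minimal illustrative sketch (not the paper's actual schemes): it estimates the evidence of a toy two-component Gaussian mixture by plain importance sampling, using a proposal centred near the empirical modes as a crude stand-in for an MLE-based importance function. All data, parameter values, and function names here are hypothetical choices for illustration. It also shows the label-switching correction the abstract alludes to: a proposal covering only one of the K! symmetric posterior modes must be rescaled by K!.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: 1-D two-component Gaussian mixture with known
# weights and common unit variance; only the component means are unknown.
y = np.concatenate([rng.normal(-2.0, 1.0, 60), rng.normal(2.0, 1.0, 40)])
w, sigma = np.array([0.6, 0.4]), 1.0

def log_lik(mu):
    # log p(y | mu) for the mixture, summed over all observations
    comp = -0.5 * ((y[:, None] - mu[None, :]) / sigma) ** 2
    comp += np.log(w) - 0.5 * np.log(2 * np.pi * sigma ** 2)
    return np.logaddexp(comp[:, 0], comp[:, 1]).sum()

def log_prior(mu, tau=5.0):
    # independent N(0, tau^2) priors on the two component means
    return (-0.5 * (mu / tau) ** 2 - 0.5 * np.log(2 * np.pi * tau ** 2)).sum()

# Importance function q: independent Gaussians centred near the two
# empirical modes -- a crude stand-in for an MLE-based proposal, which
# covers only ONE of the 2! = 2 symmetric (relabelled) posterior modes.
q_mean, q_sd = np.array([-2.0, 2.0]), 0.5

def log_q(mu):
    return (-0.5 * ((mu - q_mean) / q_sd) ** 2
            - 0.5 * np.log(2 * np.pi * q_sd ** 2)).sum()

# Evidence estimate: Z_hat = (1/N) * sum of p(y|mu) p(mu) / q(mu),
# with mu drawn from q; computed on the log scale for stability.
N = 5000
draws = rng.normal(q_mean, q_sd, size=(N, 2))
log_w = np.array([log_lik(m) + log_prior(m) - log_q(m) for m in draws])
log_Z_single = np.logaddexp.reduce(log_w) - np.log(N)

# Label-switching correction: with well-separated modes, the full evidence
# is K! times the single-mode contribution (here K = 2).
log_Z = log_Z_single + np.log(2)
print(f"log evidence estimate: {log_Z:.2f}")
```

The exponential-cost problem in the abstract is visible here: enforcing label switching directly in `q` would require a mixture proposal over all K! relabellings, whereas the single-mode-plus-rescaling shortcut only works when the modes are well separated.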


Full work available at URL: https://arxiv.org/abs/1311.6000

Cited In (9)

