Importance sampling schemes for evidence approximation in mixture models
From MaRDI portal
Abstract: The marginal likelihood (evidence) is a central tool for Bayesian inference about the number of components in mixture models. Since its exact form is typically unavailable, it must be approximated. The approximation may be biased when a simulated Markov chain (e.g., a Gibbs sequence) fails to explore the full collection of posterior modes, a phenomenon known as lack of label switching: the chain must visit all possible label permutations in order to converge and thereby overcome the bias. In an importance sampling approach, enforcing label switching in the importance function makes the computational cost grow exponentially with the number of components. This paper proposes two importance sampling schemes through choices of importance function: an MLE proposal and a Rao-Blackwellised importance function. The second scheme is called dual importance sampling. We show that dual importance sampling yields a valid estimator of the evidence and, moreover, that it increases the statistical efficiency of the estimates. To reduce the resulting computational burden, the original importance function is approximated; a suitable approximation can deliver an estimate of the same precision at a reduced computational workload.
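To make the setting concrete, the following is a minimal, self-contained sketch of plain importance sampling for the evidence of a two-component Gaussian mixture. It is illustrative only, not the paper's dual scheme: the data, the priors, and the proposal centres are all assumptions, and the proposal loosely mimics an MLE-centred importance function concentrated on a single mode labelling.

```python
import numpy as np
from math import lgamma

rng = np.random.default_rng(0)

# Toy data from a two-component Gaussian mixture (unit component variances).
x = np.concatenate([rng.normal(-2.0, 1.0, 60), rng.normal(2.0, 1.0, 40)])

def log_norm_pdf(v, mu, sigma):
    return -0.5 * np.log(2 * np.pi * sigma**2) - (v - mu) ** 2 / (2 * sigma**2)

def log_lik(w, mu1, mu2):
    """log p(x | theta) for a 2-component mixture with unit variances."""
    la = np.log(w) + log_norm_pdf(x, mu1, 1.0)
    lb = np.log1p(-w) + log_norm_pdf(x, mu2, 1.0)
    return np.logaddexp(la, lb).sum()

def log_prior(w, mu1, mu2):
    """Assumed prior: w ~ Uniform(0,1), mu_k ~ N(0, 5^2) independently."""
    if not (0.0 < w < 1.0):
        return -np.inf
    return log_norm_pdf(mu1, 0.0, 5.0) + log_norm_pdf(mu2, 0.0, 5.0)

def log_beta_pdf(w, a, b):
    return (lgamma(a + b) - lgamma(a) - lgamma(b)
            + (a - 1) * np.log(w) + (b - 1) * np.log1p(-w))

# Importance function: independent proposals centred near rough point
# estimates (a stand-in for an MLE-based proposal; centres are assumptions).
M = 5000
w_s = rng.beta(6, 4, M)
mu1_s = rng.normal(-2.0, 0.5, M)
mu2_s = rng.normal(2.0, 0.5, M)

log_q = (log_beta_pdf(w_s, 6, 4)
         + log_norm_pdf(mu1_s, -2.0, 0.5)
         + log_norm_pdf(mu2_s, 2.0, 0.5))

# Importance weights: p(x|theta) p(theta) / q(theta), kept on the log scale.
log_w = np.array([log_lik(w_s[m], mu1_s[m], mu2_s[m])
                  + log_prior(w_s[m], mu1_s[m], mu2_s[m])
                  for m in range(M)]) - log_q

# Evidence estimate via log-sum-exp for numerical stability.
# NB: with K = 2 components the exchangeable posterior has 2 symmetric mode
# labellings; this single-mode proposal effectively captures one of them,
# which is exactly the label-switching issue the abstract describes.
log_Z = np.logaddexp.reduce(log_w) - np.log(M)
print(f"log-evidence estimate: {log_Z:.2f}")
```

The exponential blow-up mentioned in the abstract arises when the proposal must instead be symmetrised over all K! label permutations of the component parameters.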
Recommendations
- Keeping the balance -- bridge sampling for marginal likelihood estimation in finite mixture, mixture of experts and Markov mixture models
- Estimating marginal likelihoods for mixture and Markov switching models using bridge sampling techniques*
- On the use of marginal posteriors in marginal likelihood estimation via importance sampling
- Mixture models, latent variables and partitioned importance sampling
- Adaptive mixture importance sampling
Cites work
- scientific article; zbMATH DE number 6114089
- scientific article; zbMATH DE number 6114093
- scientific article; zbMATH DE number 5133222
- scientific article; zbMATH DE number 509150
- scientific article; zbMATH DE number 597901
- scientific article; zbMATH DE number 1085980
- scientific article; zbMATH DE number 1932865
- scientific article; zbMATH DE number 947416
- scientific article; zbMATH DE number 2104218
- scientific article; zbMATH DE number 2117879
- scientific article; zbMATH DE number 795289
- scientific article; zbMATH DE number 3037359
- A comparative study of Monte Carlo methods for efficient evaluation of marginal likelihood
- A sequential particle filter method for static models
- Accurate Approximations for Posterior Moments and Marginal Densities
- Bayesian Density Estimation and Inference Using Mixtures
- Bayesian Methods for Hidden Markov Models
- Bayesian analysis of mixture models with an unknown number of components -- an alternative to reversible jump methods.
- Bayesian model choice based on Monte Carlo estimates of posterior model probabilities
- Calculating posterior distributions and modal estimates in Markov mixture models
- Computational and Inferential Difficulties with Mixture Posterior Distributions
- Computing Bayes Factors Using a Generalization of the Savage-Dickey Density Ratio
- Computing Bayes Factors by Combining Simulation and Asymptotic Approximations
- Dealing With Label Switching in Mixture Models
- Estimating marginal likelihoods for mixture and Markov switching models using bridge sampling techniques*
- Finite mixture and Markov switching models.
- Interpretation and inference in mixture models: simple MCMC works
- Marginal Likelihood Estimation via Power Posteriors
- Marginal Likelihood from the Gibbs Output
- Markov chain Monte Carlo Estimation of Classical and Dynamic Switching and Mixture Models
- Markov chain Monte Carlo methods and the label switching problem in Bayesian mixture modeling
- Mixtures of Dirichlet processes with applications to Bayesian nonparametric problems
- Monte Carlo methods in Bayesian computation
- On resolving the Savage-Dickey paradox
- On some difficulties with a posterior probability approximation technique
- Properties of nested sampling
- Retrospective Markov chain Monte Carlo methods for Dirichlet process hierarchical models
- Reversible jump Markov chain Monte Carlo computation and Bayesian model determination
- Sampling-Based Approaches to Calculating Marginal Densities
- Simulating normalizing constants: From importance sampling to bridge sampling to path sampling
- The Calculation of Posterior Distributions by Data Augmentation
Cited in (9)
- Mixture models, latent variables and partitioned importance sampling
- Convergence of adaptive mixtures of importance sampling schemes
- Coupling the reduced-order model and the generative model for an importance sampling estimator
- Conditional reliability analysis in high dimensions based on controlled mixture importance sampling and information reuse
- Langevin incremental mixture importance sampling
- Importance-Weighted Marginal Bayesian Posterior Density Estimation
- Keeping the balance -- bridge sampling for marginal likelihood estimation in finite mixture, mixture of experts and Markov mixture models
- Estimation and approximation of densities of i.i.d. sums via importance sampling.
- Efficient importance sampling in mixture frameworks
This page was built for publication: Importance sampling schemes for evidence approximation in mixture models (MaRDI item Q516488)