Importance sampling schemes for evidence approximation in mixture models
DOI: 10.1214/15-BA970
zbMATH Open: 1357.62116
arXiv: 1311.6000
MaRDI QID: Q516488
Publication date: 14 March 2017
Published in: Bayesian Analysis
Abstract: The marginal likelihood is a central tool for drawing Bayesian inference about the number of components in mixture models. Since its exact form is unavailable, it is typically approximated. A bias in the approximation may arise when a simulated Markov chain (e.g., a Gibbs sampler) explores the collection of posterior modes incompletely, a phenomenon known as lack of label switching: the chain must visit all possible label permutations in order to converge and hence overcome the bias. In an importance sampling approach, imposing label switching on the importance function results in a computational cost that grows exponentially with the number of components. In this paper, two importance sampling schemes are proposed through choices of the importance function: an MLE-based proposal and a Rao-Blackwellised importance function. The second scheme is called dual importance sampling. We demonstrate that dual importance sampling yields a valid estimator of the evidence and, moreover, that it increases the statistical efficiency of the estimates. To reduce the resulting computational demand, the original importance function is approximated; a suitable approximation can produce an estimate of the same precision at a reduced computational workload.
Full work available at URL: https://arxiv.org/abs/1311.6000
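As an illustration of the basic setup the abstract describes (not the paper's dual scheme), the sketch below estimates the log evidence of a toy two-component Gaussian mixture by importance sampling, with a Gaussian proposal centred at a crude point estimate of the component means. The data, prior, proposal scale, and helper names (log_lik, log_prior, mu_hat) are illustrative assumptions, not taken from the paper.

    import numpy as np
    from scipy import stats
    from scipy.special import logsumexp

    rng = np.random.default_rng(0)

    # Toy data: two-component Gaussian mixture with known weights and unit
    # variances; only the component means (mu1, mu2) are unknown.
    x = np.concatenate([rng.normal(-2.0, 1.0, 60), rng.normal(2.0, 1.0, 40)])
    weights = np.array([0.6, 0.4])

    def log_lik(mu):
        """log p(x | mu) for draws mu of shape (n, 2)."""
        comp = stats.norm.logpdf(x[None, :, None], loc=mu[:, None, :])
        return logsumexp(comp + np.log(weights), axis=2).sum(axis=1)

    def log_prior(mu):
        """Independent N(0, 5^2) priors on each component mean."""
        return stats.norm.logpdf(mu, 0.0, 5.0).sum(axis=1)

    # Importance proposal centred at a crude point estimate of the means
    # (a stand-in for a proper MLE obtained via EM).
    mu_hat = np.array([x[x < 0].mean(), x[x >= 0].mean()])
    scale = 0.3
    n = 10_000
    draws = mu_hat + scale * rng.standard_normal((n, 2))
    log_q = stats.norm.logpdf(draws, mu_hat, scale).sum(axis=1)

    # Evidence estimate: mean of the importance weights, computed on the
    # log scale with log-sum-exp for numerical stability.
    log_w = log_lik(draws) + log_prior(draws) - log_q
    log_evidence = logsumexp(log_w) - np.log(n)
    print(f"estimated log evidence: {log_evidence:.2f}")

Note that this proposal sits on a single labelling of the components, so it effectively covers only one of the k! symmetric posterior modes. Symmetrising the proposal over all label permutations removes that blind spot but requires k! density evaluations per draw, which is precisely the exponential cost the abstract refers to and which the paper's Rao-Blackwellised (dual) scheme is designed to mitigate.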
Recommendations
- Keeping the balance -- bridge sampling for marginal likelihood estimation in finite mixture, mixture of experts and Markov mixture models
- Estimating marginal likelihoods for mixture and Markov switching models using bridge sampling techniques
- On the use of marginal posteriors in marginal likelihood estimation via importance sampling
- Mixture models, latent variables and partitioned importance sampling
- Adaptive mixture importance sampling
Mathematics Subject Classification:
- Bayesian inference (62F15)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
Cites Work
- Bayesian Density Estimation and Inference Using Mixtures
- Interpretation and inference in mixture models: simple MCMC works
- Reversible jump Markov chain Monte Carlo computation and Bayesian model determination
- Markov chain Monte Carlo methods and the label switching problem in Bayesian mixture modeling
- Finite mixture and Markov switching models.
- Estimating marginal likelihoods for mixture and Markov switching models using bridge sampling techniques
- The Calculation of Posterior Distributions by Data Augmentation
- Dealing With Label Switching in Mixture Models
- Computational and Inferential Difficulties with Mixture Posterior Distributions
- Markov chain Monte Carlo Estimation of Classical and Dynamic Switching and Mixture Models
- Calculating posterior distributions and modal estimates in Markov mixture models
- Mixtures of Dirichlet processes with applications to Bayesian nonparametric problems
- Monte Carlo methods in Bayesian computation
- Marginal Likelihood from the Gibbs Output
- Sampling-Based Approaches to Calculating Marginal Densities
- Accurate Approximations for Posterior Moments and Marginal Densities
- Computing Bayes Factors Using a Generalization of the Savage-Dickey Density Ratio
- A sequential particle filter method for static models
- Bayesian analysis of mixture models with an unknown number of components -- an alternative to reversible jump methods
- Simulating normalizing constants: From importance sampling to bridge sampling to path sampling
- Retrospective Markov chain Monte Carlo methods for Dirichlet process hierarchical models
- On resolving the Savage-Dickey paradox
- Properties of nested sampling
- Marginal Likelihood Estimation via Power Posteriors
- Computing Bayes Factors by Combining Simulation and Asymptotic Approximations
- Bayesian Methods for Hidden Markov Models
- Bayesian model choice based on Monte Carlo estimates of posterior model probabilities
- A comparative study of Monte Carlo methods for efficient evaluation of marginal likelihood
- On some difficulties with a posterior probability approximation technique
Cited In (9)
- Coupling the reduced-order model and the generative model for an importance sampling estimator
- Conditional reliability analysis in high dimensions based on controlled mixture importance sampling and information reuse
- Estimation and approximation of densities of i.i.d. sums via importance sampling.
- Convergence of adaptive mixtures of importance sampling schemes
- Importance-Weighted Marginal Bayesian Posterior Density Estimation
- Efficient importance sampling in mixture frameworks
- Mixture models, latent variables and partitioned importance sampling
- Keeping the balance -- bridge sampling for marginal likelihood estimation in finite mixture, mixture of experts and Markov mixture models
- Langevin incremental mixture importance sampling