Generalized mixtures of finite mixtures and telescoping sampling
Abstract: Within a Bayesian framework, a comprehensive investigation of mixtures of finite mixtures (MFMs), i.e., finite mixtures with a prior on the number of components, is performed. This model class has applications in model-based clustering as well as in semi-parametric density estimation and requires suitable prior specifications and inference methods to exploit its full potential. We contribute by considering a generalized class of MFMs where the hyperparameter γ_K of a symmetric Dirichlet prior on the weight distribution depends on the number of components K. We show that this model class may be regarded as a Bayesian non-parametric mixture outside the class of Gibbs-type priors. We emphasize the distinction between the number of components K of a mixture and the number of clusters K_+, i.e., the number of filled components given the data. In the MFM model, K_+ is a random variable and its prior depends on the prior on K and on the hyperparameter γ_K. We employ a flexible prior distribution for K and derive the corresponding prior on the number of clusters K_+ for generalized MFMs. For posterior inference, we propose the novel telescoping sampler, which allows Bayesian inference for mixtures with arbitrary component distributions without resorting to reversible jump Markov chain Monte Carlo (MCMC) methods. The telescoping sampler explicitly samples the number of components, but otherwise requires only the usual MCMC steps of a finite mixture model. The ease of its application using different component distributions is demonstrated on several data sets.
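The sweep structure described in the abstract — sample allocations, count the filled components K_+, sample the number of components K, then update weights and component parameters as in an ordinary finite mixture — can be illustrated with a minimal sketch. This is not the paper's exact algorithm: it assumes a univariate Gaussian mixture with known unit variance, a uniform prior on K truncated at `K_MAX`, and it replaces the exact conditional for K given K_+ derived in the paper with the prior simply restricted to K ≥ K_+. All names (`telescoping_step`, `GAMMA`, `K_MAX`) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated data: two well-separated Gaussian clusters.
y = np.concatenate([rng.normal(-4, 1, 60), rng.normal(4, 1, 60)])

K_MAX = 10                        # truncation of the prior on K (illustrative)
GAMMA = 1.0                       # Dirichlet hyperparameter (fixed here; the
                                  # paper lets it depend on K)
prior_K = np.ones(K_MAX) / K_MAX  # uniform prior on K = 1..K_MAX (assumption)

def telescoping_step(K, mu, w):
    """One sweep of a simplified telescoping sampler (sketch only)."""
    # (1) Sample allocations given weights and component means
    #     (unit component variances assumed throughout).
    logp = np.log(w) - 0.5 * (y[:, None] - mu[None, :]) ** 2
    p = np.exp(logp - logp.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    z = np.array([rng.choice(K, p=pi) for pi in p])

    # (2) Identify the filled components and relabel them 0..K_+ - 1.
    filled, z = np.unique(z, return_inverse=True)
    K_plus = len(filled)

    # (3) Sample K given K_+: here crudely from the prior restricted to
    #     K >= K_+ (a stand-in for the exact conditional in the paper).
    support = np.arange(K_plus, K_MAX + 1)
    pk = prior_K[support - 1]
    K = int(rng.choice(support, p=pk / pk.sum()))

    # (4) Sample weights from the Dirichlet posterior; empty components
    #     have zero counts and so are drawn essentially from the prior.
    counts = np.bincount(z, minlength=K)
    w = rng.dirichlet(GAMMA + counts)

    # (5) Sample means: filled components from the conjugate conditional
    #     posterior (N(0, 10^2) prior), empty components from the prior.
    mu = rng.normal(0, 10, size=K)
    for k in range(K_plus):
        yk = y[z == k]
        var = 1.0 / (1.0 / 100 + len(yk))
        mu[k] = rng.normal(var * yk.sum(), np.sqrt(var))
    return K, mu, w, K_plus

# Run a short chain from an arbitrary starting state.
K, mu, w = 5, rng.normal(0, 10, 5), np.ones(5) / 5
for _ in range(200):
    K, mu, w, K_plus = telescoping_step(K, mu, w)
print("K_plus after 200 sweeps:", K_plus)
```

Note that apart from step (3), every update is a standard Gibbs step for a finite mixture with fixed K, which is the point the abstract makes: no reversible-jump moves are needed, only an explicit draw of K on top of the usual machinery.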
Cites work
- A Dirichlet process mixture model for the analysis of correlated binary responses
- A species sampling model with finitely many types
- An asymptotic analysis of a class of discrete nonparametric priors
- Asymptotic behaviour of the posterior distribution in overfitted mixture models
- Bayesian Density Estimation and Inference Using Mixtures
- Bayesian Repulsive Gaussian Mixture Model
- Combinatorial stochastic processes. Ecole d'Eté de Probabilités de Saint-Flour XXXII -- 2002.
- Density Estimation With Confidence Sets Exemplified by Superclusters and Voids in the Galaxies
- Exchangeable and partially exchangeable random partitions
- Finite mixture and Markov switching models.
- From here to infinity: sparse finite versus Dirichlet process mixtures in model-based clustering
- How many clusters?
- How many data clusters are in the galaxy data set? Bayesian cluster analysis in action
- Inconsistency of Pitman-Yor process mixtures for the number of components
- Markov chain Monte Carlo in approximate Dirichlet and beta two-parameter process hierarchical models
- Mixture models with a prior on the number of components
- Mixtures of Dirichlet processes with applications to Bayesian nonparametric problems
- Model-based clustering based on sparse finite Gaussian mixtures
- Modelling heterogeneity with and without the Dirichlet process
- On a loss-based prior for the number of components in mixture models
- On selecting a prior for the precision parameter of Dirichlet process mixture models
- On the posterior distribution of the number of components in a finite mixture
- Probabilistic Community Detection With Unknown Number of Communities
- Selecting the precision parameter prior in Dirichlet process mixture models
- Slice sampling mixture models
- Splitting and merging components of a nonconjugate Dirichlet process mixture model
- The two-parameter Poisson-Dirichlet distribution derived from a stable subordinator
Cited in (5)
MaRDI item Q130362