Variational inference and sparsity in high-dimensional deep Gaussian mixture models
From MaRDI portal
Abstract: Gaussian mixture models are a popular tool for model-based clustering, and mixtures of factor analyzers are Gaussian mixture models whose components have a parsimonious factor covariance structure. Several recent extensions of mixtures of factor analyzers to deep mixtures replace the Gaussian model for the latent factors with a further mixture of factor analyzers; iterating this construction yields a model with many layers. These deep models are challenging to fit, and we consider Bayesian inference using sparsity priors to further regularize the estimation. A scalable natural gradient variational inference algorithm is developed for fitting the model, and we suggest computationally efficient approaches to the architecture choice using overfitted mixtures, in which unnecessary components drop out during estimation. In a number of simulated examples and two real examples, we demonstrate the versatility of our approach for high-dimensional problems and show that sparsity-inducing priors can help obtain improved clustering results.
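The overfitted-mixture idea from the abstract — deliberately specifying too many components and letting a sparsity-favoring prior shrink the weights of unnecessary ones toward zero — can be illustrated with scikit-learn (which the paper cites). This is only a minimal sketch of that general mechanism using `BayesianGaussianMixture`, not the authors' deep mixture model or their natural gradient variational inference algorithm; the data, component count, and prior value below are illustrative choices.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Synthetic data: three well-separated Gaussian clusters in 2D.
X = np.vstack([
    rng.normal(loc=c, scale=0.3, size=(100, 2))
    for c in ([-3.0, 0.0], [0.0, 3.0], [3.0, 0.0])
])

# Deliberately overfitted: 10 components, but a small Dirichlet
# weight_concentration_prior pushes the weights of redundant
# components toward zero during variational estimation.
bgm = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior=1e-3,
    covariance_type="full",
    max_iter=500,
    random_state=0,
).fit(X)

# Components retaining non-negligible posterior weight; the rest
# have effectively "dropped out" of the fitted mixture.
effective = int(np.sum(bgm.weights_ > 0.02))
print(effective)  # typically recovers the 3 true clusters
```

The same drop-out behavior motivates the paper's architecture-choice strategy: rather than comparing many fitted models of different sizes, one overfitted model is estimated and the prior prunes the surplus structure.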
Recommendations
- Variational Bayesian inference with Gaussian-mixture approximations
- Stochastic variational hierarchical mixture of sparse Gaussian processes for regression
- Convergence of sparse variational inference in Gaussian processes regression
- Variational Bayesian learning for parameter estimations of mixture of Gaussians
- Variational inference for sparse spectrum Gaussian process regression
- Deep variational inference
- Beyond Prediction: A Framework for Inference With Variational Approximations in Mixture Models
- Variational inference for Dirichlet process mixtures
Cites work
- scientific article; zbMATH DE number 6377992 (no title available)
- Asymptotic behaviour of the posterior distribution in overfitted mixture models
- Automatic differentiation variational inference
- Deep Gaussian mixture models
- High-dimensional sparse factor modeling: applications in gene expression genomics
- Information theoretic measures for clusterings comparison: variants, properties, normalization and correction for chance
- Latent variable models and factor analysis. A unified approach
- Mixed Deep Gaussian Mixture Model: a clustering model for mixed datasets
- Model-based clustering based on sparse finite Gaussian mixtures
- Modelling high-dimensional data by mixtures of factor analyzers
- Scikit-learn: machine learning in Python
- Sparse Bayesian infinite factor models
- The horseshoe estimator for sparse signals
- Variational approximations in Bayesian model selection for finite mixture distributions
Cited in (8)
- Conditionally structured variational Gaussian approximation with importance weights
- Parsimonious ultrametric Gaussian mixture models
- scientific article; zbMATH DE number 6982332 (no title available)
- scientific article; zbMATH DE number 7307467 (no title available)
- Natural gradient hybrid variational inference with application to deep mixed models
- Deep Gaussian mixture models
- Sparse mixture models inspired by ANOVA decompositions
- Alpha-divergence minimization for deep Gaussian processes
MaRDI item Q2080343