Variational Bayes approach for model aggregation in unsupervised classification with Markovian dependency
From MaRDI portal
Abstract: We consider a binary unsupervised classification problem where each observation is associated with an unobserved label that we want to retrieve. More precisely, we assume that there are two groups of observations: normal and abnormal. The `normal' observations come from a known distribution, whereas the distribution of the `abnormal' observations is unknown. Several models have been developed to fit this unknown distribution; in this paper, we propose an alternative based on a mixture of Gaussian distributions. The inference is done within a variational Bayesian framework, and our aim is to infer the posterior probability of belonging to the class of interest. In this setting, estimating the number of mixture components is not the goal, since each mixture model provides more or less relevant information for estimating the posterior probability. Bayesian Model Averaging (BMA) combines the models by computing a weighted average (called the aggregated estimator) over the model collection, thereby accounting for the information provided by each model. The aim is then to estimate the weights and the posterior probability under each specific model. In this work, we derive optimal approximations of these quantities from variational theory and propose further approximations of the weights. Since the data are dependent (Markovian dependency), we work within a hidden Markov model. A simulation study is carried out to evaluate the accuracy of the estimates in terms of classification. We also present an application to the analysis of public health surveillance systems.
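The aggregation step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the variational lower bounds play the role of approximate log model evidences, and the function name and toy numbers are invented for the example.

```python
import numpy as np

def bma_aggregate(log_evidences, posteriors):
    """Aggregate per-model posterior class probabilities with BMA weights.

    log_evidences : array (M,) of approximate log model evidences
        (e.g. variational lower bounds), one per candidate mixture model.
    posteriors : array (M, n) of P(label = abnormal | data, model m)
        for each of the n observations under each model.
    Returns the (n,) aggregated posterior probabilities.
    """
    # Softmax over the log evidences gives normalized BMA weights;
    # subtracting the max keeps the exponentials numerically stable.
    w = np.exp(log_evidences - log_evidences.max())
    w /= w.sum()
    # Weighted average of the per-model posteriors over the collection.
    return w @ posteriors

# Toy usage: three candidate mixture models, four observations
# (all numbers are illustrative, not taken from the paper).
log_ev = np.array([-120.0, -118.5, -125.0])
post = np.array([[0.90, 0.10, 0.80, 0.20],
                 [0.85, 0.15, 0.70, 0.25],
                 [0.95, 0.05, 0.90, 0.30]])
agg = bma_aggregate(log_ev, post)
```

The model with the largest lower bound dominates the average, but every model contributes in proportion to its weight, which is the point of aggregating rather than selecting a single component number.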
Recommendations
- Variational Bayesian analysis for hidden Markov models
- Hidden Markov models with mixtures as emission distributions
- Bayesian unsupervised classification framework based on stochastic partitions of data and a parallel search strategy
- Variational approximations in Bayesian model selection for finite mixture distributions
- Variational Bayesian approximation method for classification and clustering with a mixture of Student-\(t\) model
Cites work
- scientific article; zbMATH DE number 1222284
- scientific article; zbMATH DE number 2117879
- A Maximization Technique Occurring in the Statistical Analysis of Probabilistic Functions of Markov Chains
- A semi-parametric approach for mixture models: application to local false discovery rate estimation
- An introduction to MCMC for machine learning
- Bayesian Model Averaging in Proportional Hazard Models: Assessing the Risk of a Stroke
- Bayesian model averaging: A tutorial. (with comments and a rejoinder).
- Estimating the dimension of a model
- Graphical models, exponential families, and variational inference
- Large-scale multiple testing under dependence
- Model Selection and Accounting for Model Uncertainty in Graphical Models Using Occam's Window
- On efficient calculations for Bayesian variable selection
- Statistical field theory. With a foreword by David Pines
- Unsupervised classification for tiling arrays: chip-chip and transcriptome
- Variational Bayesian analysis for hidden Markov models
- Variational Bayesian methods for spatial data analysis
Cited in (5)
- Improved model-based clustering performance using Bayesian initialization averaging
- Polarization of forecast densities: a new approach to time series classification
- Variational Bayes model averaging for graphon functions and motif frequencies inference in \(W\)-graph models
- Goodness of Fit of Logistic Regression Models for Random Graphs
- On variational Bayes estimation and variational information criteria for linear regression models
MaRDI item: Q693245