PMOG: The projected mixture of Gaussians model with application to blind source separation

From MaRDI portal
Publication:426657

DOI: 10.1016/J.NEUNET.2011.12.005
zbMATH Open: 1239.62070
DBLP: journals/nn/Pendse12
arXiv: 1008.2743
OpenAlex: W1496493257
Wikidata: Q39644735
Scholia: Q39644735
MaRDI QID: Q426657
FDO: Q426657


Authors: Gautam V. Pendse


Publication date: 11 June 2012

Published in: Neural Networks

Abstract: We extend the mixture of Gaussians (MOG) model to the projected mixture of Gaussians (PMOG) model. In the PMOG model, we assume that q-dimensional input data points z_i are projected by a q-dimensional vector w into 1-D variables u_i. The projected variables u_i are assumed to follow a 1-D MOG model. In the PMOG model, we maximize the likelihood of observing u_i to find both the model parameters for the 1-D MOG and the projection vector w. First, we derive an EM algorithm for estimating the PMOG model. Next, we show how the PMOG model can be applied to the problem of blind source separation (BSS). In contrast to conventional BSS, where an objective function based on an approximation to differential entropy is minimized, PMOG-based BSS directly minimizes the differential entropy of the projected sources by fitting a flexible MOG model in the projected 1-D space while simultaneously optimizing the projection vector w. The advantage of PMOG over conventional BSS algorithms is that it fits non-Gaussian source densities more flexibly, without assuming near-Gaussianity (as conventional BSS does), while still retaining computational feasibility.
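The core idea described in the abstract can be sketched as follows: project the q-dimensional data z_i onto a vector w to obtain 1-D samples u_i, then fit a 1-D mixture of Gaussians to the u_i by EM. This is a minimal illustrative sketch, not the paper's algorithm: it uses a fixed projection vector w, a simple quantile-based initialization, and NumPy (all assumptions of this example), whereas the full PMOG model also optimizes w jointly with the MOG parameters.

```python
import numpy as np

def fit_1d_mog(u, K=2, n_iter=100):
    """EM for a 1-D mixture of Gaussians fitted to projected samples u."""
    n = len(u)
    pi = np.full(K, 1.0 / K)                       # mixing weights
    mu = np.quantile(u, np.linspace(0.1, 0.9, K))  # simple deterministic init
    var = np.full(K, u.var())                      # component variances
    for _ in range(n_iter):
        # E-step: responsibilities r[i, k] ∝ pi_k * N(u_i | mu_k, var_k)
        log_p = (-0.5 * (u[:, None] - mu) ** 2 / var
                 - 0.5 * np.log(2 * np.pi * var) + np.log(pi))
        log_p -= log_p.max(axis=1, keepdims=True)  # numerical stability
        r = np.exp(log_p)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances
        nk = r.sum(axis=0)
        pi = nk / n
        mu = (r * u[:, None]).sum(axis=0) / nk
        var = (r * (u[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-8
    return pi, mu, var

# Hypothetical demo: 2-D bimodal data projected onto a fixed unit vector w.
rng = np.random.default_rng(1)
z = np.vstack([rng.normal(-3, 1, (200, 2)), rng.normal(3, 1, (200, 2))])
w = np.array([1.0, 0.0])  # fixed here; PMOG would also optimize w
u = z @ w
pi, mu, var = fit_1d_mog(u, K=2)
```

In the full PMOG-based BSS procedure, this inner MOG fit would alternate with updates to w (subject to a norm constraint), so that the entropy of the projected sources is minimized directly rather than through a fixed-form contrast function.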


Full work available at URL: https://arxiv.org/abs/1008.2743





This page was built for publication: PMOG: The projected mixture of Gaussians model with application to blind source separation
