Sparse mixture models inspired by ANOVA decompositions
Abstract: Inspired by the analysis of variance (ANOVA) decomposition of functions, we propose a Gaussian-uniform mixture model on the high-dimensional torus, relying on the assumption that the function we wish to approximate can be well explained by interactions among small groups of variables. We consider three families of components: wrapped Gaussians, diagonal wrapped Gaussians, and products of von Mises distributions. Sparsity of the mixture model is ensured by the fact that each summand is a product of Gaussian-like density functions acting on a low-dimensional subset of coordinates and uniform probability densities on the remaining directions. To learn such a sparse mixture model from given samples, we propose an objective function consisting of the negative log-likelihood of the mixture model and a regularizer that penalizes the number of its summands. To minimize this functional, we combine the expectation-maximization (EM) algorithm with a proximal step that takes the regularizer into account. To decide which summands of the mixture model are important, we apply a Kolmogorov-Smirnov test. Numerical examples demonstrate the performance of our approach.
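The density construction described in the abstract — each summand acting through low-dimensional Gaussian-like factors on a few "active" coordinates and uniformly on the rest — could be sketched as follows. This is an illustrative sketch only: the function name `sparse_mixture_density`, the `(active, mus, kappas)` component encoding, and the choice of scipy's von Mises density are assumptions, not the paper's implementation.

```python
import numpy as np
from scipy.stats import vonmises

def sparse_mixture_density(x, weights, components):
    """Evaluate a sparse mixture density on the d-torus [0, 2*pi)^d.

    Each component is a product of 1D von Mises densities on a small set
    of "active" coordinates and the uniform density 1/(2*pi) on all
    remaining directions.  `components` is a list of triples
    (active_indices, means, concentrations).  Hypothetical sketch, not
    the paper's code.
    """
    x = np.atleast_2d(x)            # shape (n, d)
    d = x.shape[1]
    dens = np.zeros(x.shape[0])
    for w, (active, mus, kappas) in zip(weights, components):
        p = np.ones(x.shape[0])
        # product of von Mises factors on the active coordinates
        for j, mu, kappa in zip(active, mus, kappas):
            p *= vonmises.pdf(x[:, j], kappa, loc=mu)
        # uniform factor on the d - |active| inactive directions
        p *= (1.0 / (2.0 * np.pi)) ** (d - len(active))
        dens += w * p
    return dens
```

A component with an empty active set reduces to the uniform density on the whole torus, which is the mechanism that lets a summand "ignore" directions irrelevant to the function being approximated.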
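The abstract's minimization strategy (EM combined with a proximal step for the summand-count penalty) and its Kolmogorov-Smirnov importance test might look roughly like the sketch below. The thresholding operator `prox_weights` and the uniform-marginal KS test are plausible stand-ins under stated assumptions; the paper's exact proximal operator and test statistic may differ.

```python
import numpy as np
from scipy.stats import kstest, uniform

def prox_weights(weights, lam):
    """Proximal-type step for a penalty on the number of summands:
    drop components whose mixture weight falls below the threshold
    `lam` and renormalize the survivors.  Hedged sketch, not the
    paper's exact operator."""
    w = np.asarray(weights, dtype=float)
    keep = w >= lam
    if not keep.any():               # never discard every component
        keep[np.argmax(w)] = True
    w = np.where(keep, w, 0.0)
    return w / w.sum()

# KS test for deciding whether a direction is "inactive": compare the
# sample marginal in that direction against the uniform law on [0, 2*pi).
samples = np.random.default_rng(0).uniform(0.0, 2.0 * np.pi, size=500)
stat, pval = kstest(samples, uniform(loc=0.0, scale=2.0 * np.pi).cdf)
# a large p-value means uniformity is not rejected, so the direction
# can plausibly be modeled by the uniform factor
```

In an EM-with-proximal loop, `prox_weights` would be applied after each M-step weight update, so the number of active summands can only shrink as the iteration proceeds.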
Recommendations
Cites work
- scientific article; zbMATH DE number 5430929
- scientific article; zbMATH DE number 5769797
- scientific article; zbMATH DE number 1375577
- scientific article; zbMATH DE number 3456704
- scientific article; zbMATH DE number 3567782
- scientific article; zbMATH DE number 2117879
- scientific article; zbMATH DE number 7370625
- scientific article; zbMATH DE number 7192338
- A Wasserstein-Type Distance in the Space of Gaussian Mixture Models
- A multivariate von Mises distribution with applications to bioinformatics
- A near-stationary subspace for ridge approximation
- Active subspace methods in theory and practice: applications to kriging surfaces
- An introduction to MCMC for machine learning
- Approximation of functions of few variables in high dimensions
- Approximation of high-dimensional periodic functions with Fourier-based methods
- Bayesian analysis for bivariate von Mises distributions
- Compressive statistical learning with random feature moments
- Estimating Mean Dimensionality of Analysis of Variance Decompositions
- Experiments. Planning, analysis and optimization.
- Fast high-dimensional approximation with sparse occupancy trees
- Finite mixture models
- High-dimensional data clustering
- High-dimensional mixture models for unsupervised image denoising (HDMI)
- Identifiability of Mixtures
- Identifiability of Mixtures of Product Measures
- Identifiability of finite mixtures of von Mises distributions
- Kullback proximal algorithms for maximum-likelihood estimation
- Learning functions of few arbitrary linear parameters in high dimensions
- Multivariate regression and machine learning with sums of separable functions
- Numerical methods of statistics.
- On EM algorithms and their proximal generalizations
- On decompositions of multivariate functions
- On the Identifiability of Finite Mixtures
- Protein Bioinformatics and Mixtures of Bivariate von Mises Distributions for Angular Data
- Sketched learning for image denoising
- Smoothing spline ANOVA models
- Sparse grid quadrature in high dimensions with applications in finance and insurance
- The analysis of directional time series: Applications to wind speed and direction. (Based on the author's thesis, Univ. of Western Australia in Perth)
- Topics in circular statistics. With 1 IBM-PC floppy disk (3.5 inch, HD)
Cited in (2)
This page was built for publication: Sparse mixture models inspired by ANOVA decompositions
MaRDI item Q2071472