GAT–GMM: Generative Adversarial Training for Gaussian Mixture Models
DOI: 10.1137/21M1445831 · OpenAlex: W3036353678 · MaRDI QID: Q5885836
Subhro Das, Farzan Farnia, Ali Jadbabaie, Unnamed Author
Publication date: 30 March 2023
Published in: SIAM Journal on Mathematics of Data Science
Full work available at URL: https://arxiv.org/abs/2006.10293
Keywords: optimal transport; Gaussian mixture models; generative adversarial networks; federated learning; minimax learning
MSC classification:
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Minimax problems in mathematical programming (90C47)
- Learning and adaptive systems in artificial intelligence (68T05)
- Optimal transportation (49Q22)
Cites Work
- Unnamed Item
- Unnamed Item
- Statistical guarantees for the EM algorithm: from population to sample-based analysis
- Minimum Hellinger distance estimates for parametric models
- Learning mixtures of separated nonspherical Gaussians
- Optimal estimation of Gaussian mixtures via denoised method of moments
- Sharp asymptotic and finite-sample rates of convergence of empirical measures in Wasserstein distance
- Optimal transport for applied mathematicians. Calculus of variations, PDEs, and modeling
- Efficiently learning mixtures of two Gaussians
- Tight Bounds for Learning a Mixture of Two Gaussians
- Learning Mixtures of Gaussians in High Dimensions
- Learning mixtures of spherical Gaussians
- The Minimum Distance Method
- A Wasserstein-Type Distance in the Space of Gaussian Mixture Models
- On parameter estimation with the Wasserstein distance
- On a mixture of Brenier and Strassen Theorems
- Learning mixtures of arbitrary Gaussians
- Mixture models, robustness, and sum of squares proofs
- Approximate Bayesian Computation with the Wasserstein Distance
- Estimating Divergence Functionals and the Likelihood Ratio by Convex Risk Minimization