Learning mixtures of structured distributions over discrete domains

From MaRDI portal
Publication:5741809

DOI: 10.1137/1.9781611973105.100
zbMATH Open: 1421.68144
arXiv: 1210.0864
OpenAlex: W1763164889
MaRDI QID: Q5741809


Authors: Siu On Chan, Ilias Diakonikolas, Xiaorui Sun, Rocco A. Servedio


Publication date: 15 May 2019

Published in: Proceedings of the Twenty-Fourth Annual ACM-SIAM Symposium on Discrete Algorithms

Abstract: Let $\mathfrak{C}$ be a class of probability distributions over the discrete domain $[n] = \{1, \dots, n\}$. We show that if $\mathfrak{C}$ satisfies a rather general condition -- essentially, that each distribution in $\mathfrak{C}$ can be well-approximated by a variable-width histogram with few bins -- then there is a highly efficient (both in terms of running time and sample complexity) algorithm that can learn any mixture of $k$ unknown distributions from $\mathfrak{C}$. We analyze several natural types of distributions over $[n]$, including log-concave, monotone hazard rate and unimodal distributions, and show that they have the required structural property of being well-approximated by a histogram with few bins. Applying our general algorithm, we obtain near-optimally efficient algorithms for all these mixture learning problems.
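The structural condition in the abstract -- that a "nice" distribution over $[n]$ is close in $L_1$ to a piecewise-constant (histogram) function with few bins -- can be illustrated with a small sketch. This is not the paper's algorithm: the function name `flatten_to_histogram` and the use of equal-width bins are illustrative assumptions (the paper's histograms have adaptively chosen, variable-width bins); the sketch only shows that flattening a monotone distribution onto few bins already gives a decent $L_1$ approximation that improves as the bin count grows.

```python
import numpy as np

def flatten_to_histogram(p, num_bins):
    """Replace p on each bin by its average value. A variable-width
    histogram would choose the bin boundaries adaptively; for simplicity
    this illustrative version uses equal-width bins."""
    q = np.empty_like(p, dtype=float)
    for idx in np.array_split(np.arange(len(p)), num_bins):
        q[idx] = p[idx].mean()  # averaging preserves total probability mass
    return q

# A monotone (decreasing) distribution over [n] with n = 16.
n = 16
p = np.arange(n, 0, -1, dtype=float)
p /= p.sum()

for k in (2, 4, 8):
    q = flatten_to_histogram(p, k)
    print(k, np.abs(p - q).sum())  # L1 error shrinks as the bin count grows
```

For this monotone example the $L_1$ error roughly halves each time the number of bins doubles, which is the kind of few-bins approximation guarantee the general learning algorithm exploits.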


Full work available at URL: https://arxiv.org/abs/1210.0864








Cited In (12)





