Efficient density estimation via piecewise polynomial approximation
DOI: 10.1145/2591796.2591848 · zbMATH Open: 1315.68163 · arXiv: 1305.3207 · OpenAlex: W2143122862 · MaRDI QID: Q5259596 · FDO: Q5259596
Rocco A. Servedio, Xiaorui Sun, Ilias Diakonikolas, Siu On Chan
Publication date: 26 June 2015
Published in: Proceedings of the forty-sixth annual ACM symposium on Theory of computing
Abstract: We give a highly efficient "semi-agnostic" algorithm for learning univariate probability distributions that are well approximated by piecewise polynomial density functions. Let $p$ be an arbitrary distribution over an interval $I$ which is $\tau$-close (in total variation distance) to an unknown probability distribution $q$ that is defined by an unknown partition of $I$ into $t$ intervals and $t$ unknown degree-$d$ polynomials specifying $q$ over each of the intervals. We give an algorithm that draws $\tilde{O}(t(d+1)/\varepsilon^2)$ samples from $p$, runs in time $\mathrm{poly}(t,d,1/\varepsilon)$, and with high probability outputs a piecewise polynomial hypothesis distribution $h$ that is $(O(\tau)+\varepsilon)$-close (in total variation distance) to $p$. This sample complexity is essentially optimal; we show that even for $\tau=0$, any algorithm that learns an unknown $t$-piecewise degree-$d$ probability distribution over $I$ to accuracy $\varepsilon$ must use $\tilde{\Omega}(t(d+1)/\varepsilon^2)$ samples from the distribution, regardless of its running time. Our algorithm combines tools from approximation theory, uniform convergence, linear programming, and dynamic programming. We apply this general algorithm to obtain a wide range of results for many natural problems in density estimation over both continuous and discrete domains. These include state-of-the-art results for learning mixtures of log-concave distributions; mixtures of $t$-modal distributions; mixtures of Monotone Hazard Rate distributions; mixtures of Poisson Binomial Distributions; mixtures of Gaussians; and mixtures of $k$-monotone densities. Our general technique yields computationally efficient algorithms for all these problems, in many cases with provably optimal sample complexities (up to logarithmic factors) in all parameters.
Full work available at URL: https://arxiv.org/abs/1305.3207
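As a concrete illustration of what a piecewise polynomial hypothesis looks like, here is a minimal Python sketch. It is not the paper's semi-agnostic algorithm (which selects the interval partition via dynamic programming and fits each piece via a linear program with approximation-theoretic error guarantees); it simply splits the sample range into $t$ equal-mass pieces and least-squares-fits a degree-$d$ polynomial to the empirical histogram on each piece. All function names, parameters, and the toy data are illustrative, not from the paper.

```python
# A minimal, simplified sketch of piecewise-polynomial density estimation.
# NOT the Chan-Diakonikolas-Servedio-Sun algorithm: the partition here is
# taken from sample quantiles instead of a dynamic program, and each piece
# is fit by ordinary least squares instead of an LP. Purely illustrative.

import numpy as np

def fit_piecewise_poly_density(samples, t=4, d=3, bins_per_piece=50):
    """Return breakpoints and per-piece polynomial coefficients."""
    samples = np.sort(np.asarray(samples, dtype=float))
    n = len(samples)
    # Equal-mass breakpoints (quantiles); the paper instead searches over
    # interval partitions with dynamic programming.
    edges = np.quantile(samples, np.linspace(0.0, 1.0, t + 1))
    pieces = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        if hi <= lo:                      # degenerate piece: fit the zero polynomial
            pieces.append(np.zeros(d + 1))
            continue
        mask = (samples >= lo) & (samples <= hi)
        # Empirical histogram density on this piece.
        counts, bin_edges = np.histogram(samples[mask], bins=bins_per_piece,
                                         range=(lo, hi))
        widths = np.diff(bin_edges)
        centers = 0.5 * (bin_edges[:-1] + bin_edges[1:])
        dens = counts / (n * widths)
        # Least-squares degree-d polynomial fit to the histogram heights.
        coeffs = np.polyfit(centers, dens, deg=min(d, bins_per_piece - 1))
        pieces.append(coeffs)
    return edges, pieces

def evaluate(edges, pieces, x):
    """Evaluate the fitted piecewise polynomial at points x (clipped at 0)."""
    x = np.atleast_1d(np.asarray(x, dtype=float))
    out = np.zeros_like(x)
    for (lo, hi), coeffs in zip(zip(edges[:-1], edges[1:]), pieces):
        mask = (x >= lo) & (x <= hi)
        out[mask] = np.maximum(np.polyval(coeffs, x[mask]), 0.0)
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy target: mixture of two Gaussians restricted to [0, 1].
    data = np.concatenate([rng.normal(0.3, 0.05, 5000),
                           rng.normal(0.7, 0.10, 5000)])
    data = data[(data >= 0) & (data <= 1)]
    edges, pieces = fit_piecewise_poly_density(data, t=4, d=3)
    grid = np.linspace(0, 1, 9)
    print(np.round(evaluate(edges, pieces, grid), 2))
```

This sketch only captures the shape of the hypothesis class; choosing the breakpoints by dynamic programming, as the paper does, rather than by quantiles is what yields guarantees for arbitrary (and adversarially placed) piecewise structure.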
Mathematics Subject Classification: Learning and adaptive systems in artificial intelligence (68T05); Computational learning theory (68Q32)
Cites Work
- Theory of Cryptography
- The geometry of differential privacy: the small database and approximate cases
- On the geometry of differential privacy
- Interactive privacy via the median mechanism
- The price of privately releasing contingency tables and the spectra of random matrices with correlated rows
- Lower Bounds in Differential Privacy
- Iterative Constructions and Private Data Release
- Differential Privacy and the Fat-Shattering Dimension of Linear Queries
- Our Data, Ourselves: Privacy Via Distributed Noise Generation
- On the complexity of differentially private data release
- Title not available
- Collusion-secure fingerprinting for digital data
- Advances in Cryptology – CRYPTO 2004
- Title not available
- New Efficient Attacks on Statistical Disclosure Control Mechanisms
- Bounds on the Sample Complexity for Private Learning and Private Data Release
- Answering $n^{2+o(1)}$ counting queries with differential privacy is hard
- Characterizing the sample complexity of private learners
- Faster Algorithms for Privately Releasing Marginals
- Faster private release of marginals on small databases
- Private Learning and Sanitization: Pure vs. Approximate Differential Privacy
- Privately releasing conjunctions and the statistical query barrier
- Efficient algorithms for privately releasing marginals via convex relaxations
Cited In (13)
- Function approximation with polynomial membership functions and alternating cluster estimation
- Maximum a posteriori estimators as a limit of Bayes estimators
- Efficient and robust density estimation using Bernstein type polynomials
- Title not available
- On the nonparametric maximum likelihood estimator for Gaussian location mixture densities with application to Gaussian denoising
- Estimation of exponential-polynomial distribution by holonomic gradient descent
- Piecewise linear approximation of empirical distributions under a Wasserstein distance constraint
- Reliable clustering of Bernoulli mixture models
- Robust Estimators in High-Dimensions Without the Computational Intractability
- Monte Carlo Gradient in Optimization Constrained by Radiative Transport Equation
- A least squares-type density estimator using a polynomial function
- Sampling Correctors
- Testing shape restrictions of discrete distributions