On the nonparametric maximum likelihood estimator for Gaussian location mixture densities with application to Gaussian denoising
From MaRDI portal
Publication:2196192
Abstract: We study the Nonparametric Maximum Likelihood Estimator (NPMLE) for estimating Gaussian location mixture densities in \(d\) dimensions from independent observations. Unlike the usual likelihood-based methods for fitting mixtures, NPMLEs are based on convex optimization. We prove finite sample results on the Hellinger accuracy of every NPMLE. Our results imply, in particular, that every NPMLE achieves near parametric risk (up to logarithmic multiplicative factors) when the true density is a discrete Gaussian mixture, without any prior information on the number of mixture components. NPMLEs can naturally be used to yield empirical Bayes estimates of the Oracle Bayes estimator in the Gaussian denoising problem. We prove bounds on the accuracy of the empirical Bayes estimate as an approximation to the Oracle Bayes estimator. Here our results imply that the empirical Bayes estimator performs at nearly the optimal level (up to logarithmic multiplicative factors) for denoising in clustering situations, without any prior knowledge of the number of clusters.
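The abstract's two-step recipe (a convex NPMLE fit of the mixing distribution, followed by empirical Bayes posterior-mean denoising) can be sketched numerically. The sketch below is illustrative only and is not the paper's algorithm: it restricts the mixing distribution to a fixed one-dimensional grid (a common discretization of the NPMLE problem) and runs the standard EM fixed-point update for the resulting convex weight optimization; all variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated clustering setup: true means in {-3, +3}, unit Gaussian noise.
n = 500
theta = rng.choice([-3.0, 3.0], size=n)
x = theta + rng.standard_normal(n)

# Fixed support grid for the mixing distribution. The NPMLE optimizes over
# all probability measures; here it is restricted to atoms on this grid.
grid = np.linspace(x.min(), x.max(), 200)

# Gaussian likelihood matrix A[i, j] = N(x_i; grid_j, 1).
A = np.exp(-0.5 * (x[:, None] - grid[None, :]) ** 2) / np.sqrt(2 * np.pi)

# EM fixed-point iteration for the convex problem
#   maximize (1/n) * sum_i log( sum_j A[i, j] * w[j] )  over the simplex.
w = np.full(grid.size, 1.0 / grid.size)
for _ in range(1000):
    post = A * w                        # unnormalized posterior over grid atoms
    post /= post.sum(axis=1, keepdims=True)
    w = post.mean(axis=0)               # multiplicative weight update

# Empirical Bayes (posterior-mean) denoising under the fitted mixing weights.
dens = A @ w                            # fitted mixture density at each x_i
eb = (A * w) @ grid / dens              # estimate of E[theta_i | x_i]

mse_raw = np.mean((x - theta) ** 2)     # ~1: risk of the identity estimator
mse_eb = np.mean((eb - theta) ** 2)     # markedly smaller in this clustered setup
print(mse_raw, mse_eb)
```

With the two well-separated clusters above, the empirical Bayes estimate shrinks each observation toward its cluster and attains a much smaller squared error than the raw observations, mirroring the denoising guarantee described in the abstract, all without being told there are two clusters.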
Recommendations
- Adaptive density estimation for clustering with Gaussian mixtures
- A Smooth Nonparametric Estimate of a Mixing Distribution Using Mixtures of Gaussians
- Generalized maximum likelihood estimation of normal mixture densities
- Approximate nonparametric maximum likelihood for mixture models: a convex optimization approach to fitting arbitrary multivariate mixing distributions
- Maximum likelihood estimation of a class of non-Gaussian densities with application to \(l_p\) deconvolution
Cites work
- scientific article; zbMATH DE number 3124366
- scientific article; zbMATH DE number 3731128
- scientific article; zbMATH DE number 3567782
- scientific article; zbMATH DE number 1059776
- scientific article; zbMATH DE number 1064642
- scientific article; zbMATH DE number 3068128
- A new algorithm and theory for penalized regression-based clustering
- A non-asymptotic penalized criterion for Gaussian mixture model selection
- A review of reliable maximum likelihood algorithms for semiparametric mixture models
- A review of semiparametric mixture models
- Adaptive density estimation for clustering with Gaussian mixtures
- Admissible Estimators, Recurrent Diffusions, and Insoluble Boundary Value Problems
- Combinatorial methods in density estimation
- Concentration inequalities and model selection. École d'Été de Probabilités de Saint-Flour XXXIII -- 2003.
- Consistency of the Maximum Likelihood Estimator in the Presence of Infinitely Many Incidental Parameters
- Convex Optimization, Shape Constraints, Compound Decisions, and Empirical Bayes Rules
- Data-driven penalty calibration: a case study for Gaussian mixture model selection
- Efficient density estimation via piecewise polynomial approximation
- Empirical Bayes on vector observations: An extension of Stein's method
- Entropies and rates of convergence for maximum likelihood and Bayes estimation for mixtures of normal densities.
- Estimating the number of clusters in a data set via the gap statistic
- Estimation of the mean of a multivariate normal distribution
- Finite mixture models
- General maximum likelihood empirical Bayes estimation of normal means
- Generalized maximum likelihood estimation of normal mixture densities
- Group-linear empirical Bayes estimates for a heteroscedastic normal mean
- High-dimensional classification via nonparametric empirical Bayes and maximum likelihood inference
- Learning mixtures of structured distributions over discrete domains
- Medical applications of finite mixture models
- Mixture models: theory, geometry and applications
- Multivariate empirical Bayes and estimation of covariance matrices
- Needles and straw in haystacks: Empirical Bayes estimates of possibly sparse sequences
- Nonparametric Maximum Likelihood Estimation of a Mixing Distribution
- Nonparametric empirical Bayes and compound decision approaches to estimation of a high-dimensional vector of normal means
- Probability inequalities for likelihood ratios and convergence rates of sieve MLEs
- Risk bounds for model selection via penalization
- SURE Estimates for a Heteroscedastic Hierarchical Model
- Sample-optimal density estimation in nearly-linear time
- Sparse Convex Clustering
- Statistical analysis of finite mixture distributions
- Statistical guarantees for the EM algorithm: from population to sample-based analysis
- Statistical properties of convex clustering
- The EM Algorithm and Related Statistical Models
- The Empirical Bayes Approach to Statistical Decision Problems
- The geometry of mixture likelihoods, part II: The exponential family
- The geometry of mixture likelihoods: A general theory
- Tweedie’s Formula and Selection Bias
- Weak convergence and empirical processes. With applications to statistics
Cited in (27)
- Learning Gaussian mixtures using the Wasserstein-Fisher-Rao gradient flow
- Empirical Bayes Mean Estimation With Nonparametric Errors Via Order Statistic Regression on Replicated Data
- Fisher-Pitman Permutation Tests Based on Nonparametric Poisson Mixtures with Application to Single Cell Genomics
- High-dimensional linear discriminant analysis using nonparametric methods
- Nonparametric empirical Bayes biomarker imputation and estimation
- Maximum likelihood estimation of a class of non-Gaussian densities with application to \(l_p\) deconvolution
- Poisson mean vector estimation with nonparametric maximum likelihood estimation and application to protein domain data
- Randomly initialized EM algorithm for two-component Gaussian mixture achieves near optimality in \(O(\sqrt{n})\) iterations
- Set-convergence and its application: a tutorial
- A nonparametric empirical Bayes approach to large-scale multivariate regression
- Implicit models of Gaussian mixture densities and locally optimum detectors
- No need for an oracle: the nonparametric maximum likelihood decision in the compound decision problem is minimax
- Optimal estimation of Gaussian mixtures via denoised method of moments
- Entropy regularization in probabilistic clustering
- Optimal estimation of high-dimensional Gaussian location mixtures
- High dimensional discriminant rules with shrinkage estimators of the covariance matrix and mean vector
- Invidious comparisons: ranking and selection as compound decisions
- scientific article; zbMATH DE number 7300702
- Non-parametric likelihood based channel estimator for Gaussian mixture noise
- Generalized maximum likelihood estimation of the mean of parameters of mixtures. With applications to sampling and to observational studies
- A Compound Decision Approach to Covariance Matrix Estimation
- Approximate nonparametric maximum likelihood for mixture models: a convex optimization approach to fitting arbitrary multivariate mixing distributions
- Likelihood Maximization and Moment Matching in Low SNR Gaussian Mixture Models
- Uniform consistency in nonparametric mixture models
- Conditional Expected Likelihood Technique for Compound Gaussian and Gaussian Distributed Noise Mixtures
- Simultaneous estimation of normal means with side information
- Minimax bounds for estimating multivariate Gaussian location mixtures