Entropy estimation via uniformization
From MaRDI portal
Publication:6136099
Abstract: Entropy estimation is of practical importance in information theory and statistical science. Many existing entropy estimators suffer from estimation bias that grows rapidly with dimensionality, rendering them unsuitable for high-dimensional problems. In this work we propose a transform-based method for high-dimensional entropy estimation, which consists of two main ingredients. First, by modifying the \(k\)-NN based entropy estimator, we propose a new estimator that enjoys small estimation bias for samples close to a uniform distribution. Second, we design a normalizing-flow based mapping that pushes samples toward a uniform distribution, and we derive the relation between the entropy of the original samples and that of the transformed ones. As a result, the entropy of a given set of samples is estimated by first transforming them toward a uniform distribution and then applying the proposed estimator to the transformed samples. The performance of the proposed method is compared against several existing entropy estimators on both mathematical examples and real-world applications.
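The \(k\)-NN estimator the abstract builds on is, in its classical form, the Kozachenko–Leonenko estimator: \(\hat H = \psi(N) - \psi(k) + \log c_d + \frac{d}{N}\sum_i \log \varepsilon_i\), where \(\varepsilon_i\) is the distance from sample \(i\) to its \(k\)-th nearest neighbour and \(c_d\) is the unit-ball volume; the change-of-variables relation used by the transform step is \(H(X) = H(T(X)) - \mathbb{E}[\log|\det \nabla T(X)|]\) for an invertible map \(T\). The sketch below implements only the classical estimator as a point of reference, not the paper's modified version; all names are illustrative.

```python
import math
import numpy as np

def _psi_int(n):
    # Digamma at a positive integer: psi(n) = -gamma + H_{n-1}
    # (harmonic number), avoiding a SciPy dependency.
    return -0.5772156649015329 + sum(1.0 / i for i in range(1, n))

def kl_entropy(x, k=3):
    """Classical Kozachenko-Leonenko k-NN estimate of differential
    entropy in nats, for an (n, d) array of samples x.

    Brute-force O(n^2) distances for clarity; a k-d tree would be
    used in practice.
    """
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    diff = x[:, None, :] - x[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    np.fill_diagonal(dist, np.inf)          # exclude the point itself
    eps = np.sort(dist, axis=1)[:, k - 1]   # distance to k-th neighbour
    # log volume of the d-dimensional unit ball
    log_c_d = (d / 2.0) * math.log(math.pi) - math.lgamma(d / 2.0 + 1.0)
    return _psi_int(n) - _psi_int(k) + log_c_d + d * float(np.mean(np.log(eps)))
```

On samples that are already close to uniform on \([0,1]^d\) the true entropy is 0, which is the regime where the paper's modified estimator is designed to have small bias; for a standard 1-D Gaussian the true value is \(\tfrac{1}{2}\log(2\pi e)\approx 1.419\) nats.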
Recommendations
- Efficient multivariate entropy estimation via \(k\)-nearest neighbour distances
- The use of power transformations for improved entropy estimation
- Estimation of Entropy and Mutual Information
- On the Estimation of Differential Entropy From Data Located on Embedded Manifolds
- Testing uniformity based on new entropy estimators
Cites work
- scientific article (untitled); zbMATH DE number 3868406
- scientific article (untitled); zbMATH DE number 53542
- scientific article (untitled); zbMATH DE number 3518103
- scientific article (untitled); zbMATH DE number 1092005
- scientific article (untitled); zbMATH DE number 922427
- scientific article (untitled); zbMATH DE number 7370574
- A Mathematical Theory of Communication
- A new class of random vector entropy estimators and its applications in testing statistical hypotheses
- Combinatorics of partial derivatives
- Demystifying fixed \(k\)-nearest neighbor information estimators
- Density-free convergence properties of various estimators of entropy
- Efficient multivariate entropy estimation via \(k\)-nearest neighbour distances
- Ensemble Estimators for Multivariate Entropy Estimation
- Estimation of entropy and other functionals of a multivariate density
- Estimation of integral functionals of a density
- Geometric k-nearest neighbor estimation of entropy and mutual information
- Lectures on the nearest neighbor method
- Maximum Entropy Sampling and Optimal Bayesian Experimental Design
- Minimum-entropy estimation in semi-parametric models
- On the estimation of entropy
- Optimal rates of entropy estimation over Lipschitz balls
- Sensitivity analysis for stochastic simulators using differential entropy
- Sample estimate of the entropy of a random vector
- The jackknife estimate of variance
- Towards Bayesian experimental design for nonlinear models that require a large number of sampling times