Edgeworth Approximation of Multivariate Differential Entropy
Publication: 5706652
DOI: 10.1162/0899766054323026
zbMath: 1076.62013
Wikidata: Q51966299 (Scholia: Q51966299)
MaRDI QID: Q5706652
Publication date: 21 November 2005
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/0899766054323026
62E17: Approximations to statistical distributions (nonasymptotic)
62B10: Statistical aspects of information-theoretic topics
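The record concerns Edgeworth-expansion-based approximation of multivariate differential entropy. As a rough illustration only (not the paper's multivariate derivation), the well-known univariate special case approximates entropy as the Gaussian entropy minus cumulant correction terms, H(x) ≈ ½ log(2πeσ²) − κ₃²/12 − κ₄²/48, where κ₃ and κ₄ are the standardized third and fourth cumulants. A minimal sketch of that univariate formula, with all function and variable names being our own:

```python
import numpy as np

def edgeworth_entropy_1d(x):
    """Edgeworth-style approximation of differential entropy (in nats)
    for a 1-D sample. This is the standard univariate special case;
    the paper itself treats the multivariate expansion."""
    x = np.asarray(x, dtype=float)
    sigma = x.std()
    z = (x - x.mean()) / sigma          # standardize the sample
    k3 = np.mean(z**3)                  # third cumulant (skewness)
    k4 = np.mean(z**4) - 3.0            # fourth cumulant (excess kurtosis)
    h_gauss = 0.5 * np.log(2 * np.pi * np.e * sigma**2)
    # Gaussian entropy minus the leading Edgeworth correction terms
    return h_gauss - k3**2 / 12.0 - k4**2 / 48.0

rng = np.random.default_rng(0)
h = edgeworth_entropy_1d(rng.standard_normal(100_000))
# For Gaussian data the corrections vanish, so h ≈ 0.5*log(2*pi*e) ≈ 1.4189
```

For near-Gaussian densities the correction terms are small and the estimate reduces to the Gaussian entropy, which is why expansions of this kind are popular as negentropy and non-Gaussianity measures in projection pursuit and ICA (several of the works cited below).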
Related Items
- Information-Theoretic Representation Learning for Positive-Unlabeled Classification
- Improved Approximation of the Sum of Random Vectors by the Skew Normal Distribution
- Sufficient Dimension Reduction via Squared-Loss Mutual Information Estimation
- Differential Log Likelihood for Evaluating and Learning Gaussian Mixtures
- Expansions for log densities of multivariate estimates
- Minimum mutual information and non-Gaussianity through the maximum entropy method: theory and properties
- Particle-kernel estimation of the filter density in state-space models
- Direct density-ratio estimation with dimensionality reduction via least-squares hetero-distributional subspace search
- Parametric Bayesian estimation of differential entropy and relative entropy
- Machine learning with squared-loss mutual information
- Dimensionality reduction for density ratio estimation in high-dimensional spaces
- Singular value decomposition of the third multivariate moment
- Detecting direct associations in a network by information theoretic approaches
- Normality-based validation for crisp clustering
Cites Work
- A Mathematical Theory of Communication
- Projection pursuit
- Independent component analysis, a new concept?
- What is Projection Pursuit?
- A nonparametric estimation of the entropy for absolutely continuous distributions (Corresp.)
- Estimation of Entropy and Mutual Information
- Limit theorems for sums of general functions of m-spacings
- Adaptive Blind Deconvolution of Linear Channels Using Renyi's Entropy with Parzen Window Estimation