Statistical estimation of conditional Shannon entropy
DOI: 10.1051/ps/2018026 · zbMath: 1418.60026 · arXiv: 1804.08741 · OpenAlex: W2963845195 · Wikidata: Q128813910 · Scholia: Q128813910 · MaRDI QID: Q4967803
Alexey Alexandrovich Kozhevin, Alexander V. Bulinski
Publication date: 11 July 2019
Published in: ESAIM: Probability and Statistics
Full work available at URL: https://arxiv.org/abs/1804.08741
Keywords: Gaussian model; logistic regression; Shannon entropy; asymptotic unbiasedness; \(L^2\)-consistency; conditional entropy estimates
MSC classification: Estimation in multivariate analysis (62H12); Asymptotic properties of nonparametric inference (62G20); Measures of information, entropy (94A17); \(L^p\)-limit theorems (60F25)
Cites Work
- A Mathematical Theory of Communication
- Limit theory for point processes in manifolds
- On the Kozachenko-Leonenko entropy estimator
- Concentration inequalities and model selection. Ecole d'Eté de Probabilités de Saint-Flour XXXIII -- 2003.
- Lectures on the nearest neighbor method
- Logistic regression. A self-learning text. With contributions by Erica Rihl Pryor
- Sample estimate of the entropy of a random vector
- On the estimation of entropy
- Efficient multivariate entropy estimation via \(k\)-nearest neighbour distances
- Statistical estimation of the Shannon entropy
- Entropy-based inhomogeneity detection in fiber materials
- Infinite family of approximations of the digamma function
- A Complete Proof of Universal Inequalities for the Distribution Function of the Binomial Law
- Bayesian Entropy Estimation for Countable Discrete Distributions
- $f$ -Divergence Inequalities
- A computationally efficient estimator for mutual information
- Estimation of Entropy and Mutual Information
- The different paths to entropy
- Ensemble Estimators for Multivariate Entropy Estimation
- Real Analysis
- Probability-1