A New Entropy Power Inequality for Integer-Valued Random Variables
From MaRDI portal
Publication:2986249
Abstract: The entropy power inequality (EPI) provides lower bounds on the differential entropy of the sum of two independent real-valued random variables in terms of the individual entropies. Versions of the EPI for discrete random variables have been obtained for special families of distributions with the differential entropy replaced by the discrete entropy, but no universal inequality is known (beyond trivial ones). More recently, the sumset theory for the entropy function provides a sharp inequality $H(X+X') - H(X) \geq \tfrac{1}{2} - o(1)$ when $X, X'$ are i.i.d. with high entropy. This paper provides the inequality $H(X+X') - H(X) \geq g(H(X))$, where $X, X'$ are arbitrary i.i.d. integer-valued random variables and where $g$ is a universal strictly positive function on $\mathbb{R}_+$ satisfying $g(0) = 0$. Extensions to non-identically distributed random variables and to conditional entropies are also obtained.
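The inequality can be checked numerically for any concrete pmf: the entropy of the sum of two i.i.d. integer-valued variables strictly exceeds the entropy of one of them (unless the variable is constant). A minimal sketch, not from the paper, using a toy pmf on {0, 1, 2, 3} chosen purely for illustration:

```python
import math

def entropy(pmf):
    """Shannon entropy (in bits) of a pmf given as {value: probability}."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

def convolve(pmf_a, pmf_b):
    """pmf of A + B for independent integer-valued A and B."""
    out = {}
    for a, pa in pmf_a.items():
        for b, pb in pmf_b.items():
            out[a + b] = out.get(a + b, 0.0) + pa * pb
    return out

# Toy pmf (an arbitrary example, not taken from the paper)
pmf_x = {0: 0.50, 1: 0.25, 2: 0.15, 3: 0.10}

h_x = entropy(pmf_x)                      # H(X)
h_sum = entropy(convolve(pmf_x, pmf_x))   # H(X + X') for an i.i.d. copy X'

print(f"H(X)    = {h_x:.4f} bits")
print(f"H(X+X') = {h_sum:.4f} bits")
assert h_sum > h_x  # strict gain, consistent with g being strictly positive
```

The paper's contribution is that the gain $H(X+X') - H(X)$ is bounded below by a universal function of $H(X)$ alone, not merely positive for each fixed distribution as this numerical check shows.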
Cited in (8):
- Concentration functions and entropy bounds for discrete log-concave distributions
- The convexification effect of Minkowski summation
- Entropy inequalities for sums in prime cyclic groups
- Majorization and Rényi entropy inequalities via Sperner theory
- Entropy-variance inequalities for discrete log-concave random variables via degree of freedom
- Approximate discrete entropy monotonicity for log-concave sums
- Bernoulli sums and Rényi entropy inequalities
- Entropy and the discrete central limit theorem