Two Remarks on Generalized Entropy Power Inequalities
From MaRDI portal
Publication:5115966
Abstract: This note contributes to the understanding of generalized entropy power inequalities. Our main goal is to construct a counter-example regarding monotonicity and entropy comparison of weighted sums of independent identically distributed log-concave random variables. We also present a complex analogue of a recent dependent entropy power inequality of Hao and Jog, and give a very simple proof.
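For context, the inequalities the abstract refers to generalize the classical Shannon-Stam entropy power inequality, which can be stated as follows (this statement is background, not part of the record itself):

```latex
% For independent random vectors X, Y in R^n with differential entropies
% h(X), h(Y), define the entropy power
%   N(X) = \frac{1}{2\pi e} e^{2 h(X)/n}.
% The entropy power inequality asserts
N(X+Y) \;\ge\; N(X) + N(Y),
% with equality if and only if X and Y are Gaussian with
% proportional covariance matrices.
```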
Recommendations
- A generalization of the entropy power inequality with applications
- A Generalization of Costa’s Entropy Power Inequality
- On Rényi Entropy Power Inequalities
- A new entropy power inequality
- On a generalization of the entropy inequality
- Variants of the Entropy Power Inequality
- A Strong Entropy Power Inequality
- A Vector Generalization of Costa's Entropy-Power Inequality With Applications
- Some inequalities on generalized entropies
- On the similarity of the entropy power inequality and the Brunn-Minkowski inequality (Corresp.)
Cites work
- Scientific article; zbMATH DE number 2131215 (no title available)
- A Concise Guide to Complex Hadamard Matrices
- A Hadamard matrix of order 428
- A Mathematical Theory of Communication
- A Strong Entropy Power Inequality
- A free analogue of Shannon's problem on monotonicity of entropy
- A reverse entropy power inequality for log-concave random vectors
- Beyond the Entropy Power Inequality, via Rearrangements
- Combinatorial Entropy Power Inequalities: A Preliminary Study of the Stam Region
- Dimensional behaviour of entropy and information
- Do Minkowski averages get progressively more convex?
- Entropy Bounds on Abelian Groups and the Ruzsa Divergence
- Entropy Power Inequality for the Rényi Entropy
- Entropy and Hadamard matrices
- Entropy and the central limit theorem
- Forward and reverse entropy power inequalities in convex geometry
- Fractional generalizations of Young and Brunn-Minkowski inequalities
- Gaussian kernels have only Gaussian maximizers
- Gaussian mixtures: entropy and geometric inequalities
- Generalized Entropy Power Inequalities and Monotonicity Properties of Information
- Majorization and Rényi entropy inequalities via Sperner theory
- Monotonic Decrease of the Non-Gaussianness of the Sum of Independent Random Variables: A Simple Proof
- Optimal Young's inequality and its converse: A simple proof
- Rate of convergence and Edgeworth-type expansion in the entropic central limit theorem
- Rényi divergence and the central limit theorem
- Solution of Shannon’s problem on the monotonicity of entropy
- Some inequalities satisfied by the quantities of information of Fisher and Shannon
- Sumset and Inverse Sumset Inequalities for Differential Entropy and Mutual Information
- Superadditivity of Fisher's information and logarithmic Sobolev inequalities
- The Capacity Region of the Two-Receiver Gaussian Vector Broadcast Channel With Private and Common Messages
- The convexification effect of Minkowski summation
Cited in (6)
- Weighted Brunn-Minkowski theory. I: On weighted surface area measures
- Concentration of information content for convex measures
- Beyond the Entropy Power Inequality, via Rearrangements
- On the similarity of the entropy power inequality and the Brunn-Minkowski inequality (Corresp.)
- On the entropy and information of Gaussian mixtures
- Volumes of subset Minkowski sums and the Lyusternik region