Two Remarks on Generalized Entropy Power Inequalities
From MaRDI portal
Publication: 5115966
DOI: 10.1007/978-3-030-46762-3_7
zbMATH Open: 1453.60058
arXiv: 1904.02314
OpenAlex: W2931763837
MaRDI QID: Q5115966
FDO: Q5115966
Mokshay Madiman, Piotr Nayar, Tomasz Tkocz
Publication date: 21 August 2020
Published in: Lecture Notes in Mathematics
Abstract: This note contributes to the understanding of generalized entropy power inequalities. Our main goal is to construct a counter-example regarding monotonicity and entropy comparison of weighted sums of independent identically distributed log-concave random variables. We also present a complex analogue of a recent dependent entropy power inequality of Hao and Jog, and give a very simple proof.
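For context (not part of the abstract above), the classical Shannon-Stam entropy power inequality, of which the paper studies generalizations, can be stated as follows; the notation here is the standard one and is supplied as background, not quoted from the paper:

```latex
% Classical entropy power inequality (Shannon-Stam), given as background.
% For independent random vectors X, Y in R^n with densities, with
% h(X) = -\int f \log f the differential entropy of X:
\[
  N(X+Y) \;\ge\; N(X) + N(Y),
  \qquad
  N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n},
\]
% with equality when X and Y are Gaussian with proportional covariances.
```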
Full work available at URL: https://arxiv.org/abs/1904.02314
Recommendations
- A generalization of the entropy power inequality with applications
- A Generalization of Costa’s Entropy Power Inequality
- On Rényi Entropy Power Inequalities
- A new entropy power inequality
- On a generalization of the entropy inequality
- Variants of the Entropy Power Inequality
- A Strong Entropy Power Inequality
- A Vector Generalization of Costa's Entropy-Power Inequality With Applications
- Some inequalities on generalized entropies
- On the similarity of the entropy power inequality and the Brunn-Minkowski inequality (Corresp.)
Cites Work
- A Mathematical Theory of Communication
- Title not available
- A Concise Guide to Complex Hadamard Matrices
- Superadditivity of Fisher's information and logarithmic Sobolev inequalities
- Solution of Shannon’s problem on the monotonicity of entropy
- Entropy and the central limit theorem
- Fractional generalizations of Young and Brunn-Minkowski inequalities
- Some inequalities satisfied by the quantities of information of Fisher and Shannon
- Gaussian kernels have only Gaussian maximizers
- Monotonic Decrease of the Non-Gaussianness of the Sum of Independent Random Variables: A Simple Proof
- Generalized Entropy Power Inequalities and Monotonicity Properties of Information
- Optimal Young's inequality and its converse: A simple proof
- Rate of convergence and Edgeworth-type expansion in the entropic central limit theorem
- A Hadamard matrix of order 428
- Dimensional behaviour of entropy and information
- Gaussian mixtures: entropy and geometric inequalities
- The convexification effect of Minkowski summation
- Do Minkowski averages get progressively more convex?
- The Capacity Region of the Two-Receiver Gaussian Vector Broadcast Channel With Private and Common Messages
- Rényi divergence and the central limit theorem
- Sumset and Inverse Sumset Inequalities for Differential Entropy and Mutual Information
- Entropy Bounds on Abelian Groups and the Ruzsa Divergence
- A free analogue of Shannon's problem on monotonicity of entropy
- Forward and reverse entropy power inequalities in convex geometry
- Entropy and Hadamard matrices
- A reverse entropy power inequality for log-concave random vectors
- Beyond the Entropy Power Inequality, via Rearrangements
- Combinatorial Entropy Power Inequalities: A Preliminary Study of the Stam Region
- Entropy Power Inequality for the Rényi Entropy
- Majorization and Rényi entropy inequalities via Sperner theory
- A Strong Entropy Power Inequality
Cited In (6)
- Weighted Brunn-Minkowski theory. I: On weighted surface area measures
- Concentration of information content for convex measures
- Beyond the Entropy Power Inequality, via Rearrangements
- On the similarity of the entropy power inequality and the Brunn-Minkowski inequality (Corresp.)
- On the entropy and information of Gaussian mixtures
- Volumes of subset Minkowski sums and the Lyusternik region