Two Remarks on Generalized Entropy Power Inequalities
From MaRDI portal
Publication:5115966
DOI: 10.1007/978-3-030-46762-3_7
zbMath: 1453.60058
arXiv: 1904.02314
OpenAlex: W2931763837
MaRDI QID: Q5115966
Mokshay Madiman, Tomasz Tkocz, Piotr Nayar
Publication date: 21 August 2020
Published in: Lecture Notes in Mathematics
Full work available at URL: https://arxiv.org/abs/1904.02314
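For context, the entropy power inequality discussed in this publication states that for independent random vectors X and Y in R^n, N(X + Y) ≥ N(X) + N(Y), where N(X) = exp(2h(X)/n) / (2πe) and h is differential entropy. A minimal sketch (an illustrative aside, not taken from the publication itself) checking the Gaussian case, where the inequality is tight:

```python
import math

def entropy_power_gaussian(sigma2, n=1):
    # For a Gaussian on R^n with covariance sigma2 * I,
    # h = (n/2) * log(2*pi*e*sigma2), so the entropy power
    # N = exp(2h/n) / (2*pi*e) reduces to sigma2 itself.
    h = 0.5 * n * math.log(2 * math.pi * math.e * sigma2)
    return math.exp(2 * h / n) / (2 * math.pi * math.e)

# For independent Gaussians, variances add, and the EPI
# N(X + Y) >= N(X) + N(Y) holds with equality.
nx = entropy_power_gaussian(1.0)
ny = entropy_power_gaussian(4.0)
nsum = entropy_power_gaussian(1.0 + 4.0)
assert abs(nsum - (nx + ny)) < 1e-9
```

For non-Gaussian summands the inequality is generally strict; the Gaussian case above is the extremal one.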
Related Items (3)
- Concentration of information content for convex measures
- Weighted Brunn-Minkowski theory. I: On weighted surface area measures
- Volumes of subset Minkowski sums and the Lyusternik region
Cites Work
- A Mathematical Theory of Communication
- Rate of convergence and Edgeworth-type expansion in the entropic central limit theorem
- Do Minkowski averages get progressively more convex?
- Dimensional behaviour of entropy and information
- Gaussian kernels have only Gaussian maximizers
- Superadditivity of Fisher's information and logarithmic Sobolev inequalities
- A free analogue of Shannon's problem on monotonicity of entropy
- Entropy and the central limit theorem
- Optimal Young's inequality and its converse: A simple proof
- Rényi divergence and the central limit theorem
- The convexification effect of Minkowski summation
- Gaussian mixtures: entropy and geometric inequalities
- Majorization and Rényi entropy inequalities via Sperner theory
- Forward and reverse entropy power inequalities in convex geometry
- A reverse entropy power inequality for log-concave random vectors
- Entropy Power Inequality for the Rényi Entropy
- Beyond the Entropy Power Inequality, via Rearrangements
- Sumset and Inverse Sumset Inequalities for Differential Entropy and Mutual Information
- The Capacity Region of the Two-Receiver Gaussian Vector Broadcast Channel With Private and Common Messages
- Fractional generalizations of Young and Brunn-Minkowski inequalities
- Some inequalities satisfied by the quantities of information of Fisher and Shannon
- A Concise Guide to Complex Hadamard Matrices
- Monotonic Decrease of the Non-Gaussianness of the Sum of Independent Random Variables: A Simple Proof
- Generalized Entropy Power Inequalities and Monotonicity Properties of Information
- Entropy and Hadamard matrices
- Entropy Bounds on Abelian Groups and the Ruzsa Divergence
- A Strong Entropy Power Inequality
- Combinatorial Entropy Power Inequalities: A Preliminary Study of the Stam Region
- Solution of Shannon’s problem on the monotonicity of entropy
- A Hadamard matrix of order 428