A concavity property for the reciprocal of Fisher information and its consequences on Costa's EPI
Publication: Q1618481
DOI: 10.1016/J.PHYSA.2015.03.018
zbMATH Open: 1400.94103
arXiv: 1410.2722
OpenAlex: W2075360622
MaRDI QID: Q1618481
Author: G. Toscani
Publication date: 13 November 2018
Published in: Physica A
Abstract: We prove that the reciprocal of the Fisher information of a log-concave probability density \(X\) in \(\mathbb{R}^n\) is concave in \(t\) with respect to the addition of a Gaussian noise \(Z_t = N(0, tI_n)\). As a byproduct of this result we show that the third derivative of the entropy power of a log-concave probability density \(X\) in \(\mathbb{R}^n\) is nonnegative in \(t\) with respect to the addition of a Gaussian noise \(Z_t\). For log-concave densities this improves the well-known concavity property of the entropy power due to Costa.
Full work available at URL: https://arxiv.org/abs/1410.2722
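For reference, the results stated in the abstract can be written out as follows. This is a sketch using the standard definitions of Fisher information \(I\), Shannon entropy \(H\), and entropy power \(N\); the notation below is supplied here, not taken from the record itself.

```latex
% Let X be log-concave in R^n with density u(x), and let
% X_t = X + Z_t with Gaussian noise Z_t ~ N(0, t I_n),
% so the density u(x,t) of X_t solves the heat equation.
% Standard definitions:
\[
  I(X_t) = \int_{\mathbb{R}^n} \frac{|\nabla u(x,t)|^2}{u(x,t)}\,dx,
  \qquad
  N(X_t) = \frac{1}{2\pi e}\, \exp\!\Big(\frac{2}{n} H(X_t)\Big).
\]
% Main result: the reciprocal of Fisher information is concave in t,
\[
  \frac{d^2}{dt^2}\,\frac{1}{I(X_t)} \;\le\; 0 .
\]
% Byproduct: the third derivative of the entropy power is nonnegative,
\[
  \frac{d^3}{dt^3}\, N(X_t) \;\ge\; 0 ,
\]
% which, for log-concave densities, sharpens Costa's concavity
% property of the entropy power,
\[
  \frac{d^2}{dt^2}\, N(X_t) \;\le\; 0 .
\]
```

The two statements are linked through de Bruijn's identity, \(\frac{d}{dt} H(X_t) = \frac{1}{2} I(X_t)\), which ties derivatives of the entropy power to the Fisher information along the Gaussian perturbation.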
Recommendations
- Further investigations of Rényi entropy power inequalities and an entropic characterization of \(s\)-concave densities
- A new entropy power inequality
- On the Problem of Reversibility of the Entropy Power Inequality
- INEQUALITIES FOR THE DEPENDENT GAUSSIAN NOISE CHANNELS BASED ON FISHER INFORMATION AND COPULAS
- A short proof of the "concavity of entropy power"
Keywords: Fisher information; entropy-power inequality; log-concave functions; Blachman-Stam inequality; Costa's concavity property
Cites Work
- A Mathematical Theory of Communication
- Title not available
- Inequalities: theory of majorization and its applications
- The Concavity of Rényi Entropy Power
- A new entropy power inequality
- Rényi entropies and nonlinear diffusion equations
- A short proof of the "concavity of entropy power"
- Information theoretic inequalities
- Speed of approach to equilibrium for Kac's caricature of a Maxwellian gas
- Information Theoretic Proofs of Entropy Power Inequalities
- Some inequalities satisfied by the quantities of information of Fisher and Shannon
- Mutual Information and Minimum Mean-Square Error in Gaussian Channels
- A simple proof of the entropy-power inequality
- The convolution inequality for entropy powers
- An information-theoretic proof of Nash's inequality
- A strengthened central limit theorem for smooth densities
- A generalization of the entropy power inequality with applications
- Heat equation and convolution inequalities
- A Strengthened Entropy Power Inequality for Log-Concave Densities
- Higher Order Derivatives in Costa’s Entropy Power Inequality
Cited In (4)