The Entropy Per Coordinate of a Random Vector is Highly Constrained Under Convexity Conditions
From MaRDI portal
Publication:5272404
DOI: 10.1109/TIT.2011.2158475
zbMath: 1365.94135
arXiv: 1006.2883
MaRDI QID: Q5272404
Mokshay Madiman, Sergey G. Bobkov
Publication date: 12 July 2017
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://arxiv.org/abs/1006.2883
Mathematics Subject Classification
Infinitely divisible distributions; stable distributions (60E07)
Inequalities; stochastic orderings (60E15)
Measures of information, entropy (94A17)
Related Items
Entropy-based test for generalised Gaussian distributions
Rogers-Shephard inequality for log-concave functions
Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures
On the equivalence of statistical distances for isotropic convex measures
Concentration of information content for convex measures
Optimal Concentration of Information Content for Log-Concave Densities
Entropy-variance inequalities for discrete log-concave random variables via degree of freedom
Reverse Brunn-Minkowski and reverse entropy power inequalities for convex measures
Reliability and expectation bounds based on Hardy's inequality
Dimensional behaviour of entropy and information
Norms of weighted sums of log-concave random vectors
Bernoulli sums and Rényi entropy inequalities
Dimensional variance inequalities of Brascamp-Lieb type and a local approach to dimensional Prékopa's theorem
Rényi entropy power inequality and a reverse
Concentration of the information in data with log-concave distributions
A Combinatorial Approach to Small Ball Inequalities for Sums and Differences
Rogers-Shephard and local Loomis-Whitney type inequalities
A reverse entropy power inequality for log-concave random vectors
On the Problem of Reversibility of the Entropy Power Inequality
Maximum-a-Posteriori Estimation with Bayesian Confidence Regions