Solution of Shannon’s problem on the monotonicity of entropy
From MaRDI portal
Publication:4821035
DOI: 10.1090/S0894-0347-04-00459-X · zbMath: 1062.94006 · OpenAlex: W1544500240 · Wikidata: Q56070029 · Scholia: Q56070029 · MaRDI QID: Q4821035
Authors: Shiri Artstein-Avidan, Keith M. Ball, Franck Barthe, Assaf Naor
Publication date: 7 October 2004
Published in: Journal of the American Mathematical Society
Full work available at URL: https://doi.org/10.1090/s0894-0347-04-00459-x
MSC classifications: Measures of information, entropy (94A17); Statistical aspects of information-theoretic topics (62B10)
Related Items
- Gaussian approximations in high dimensional estimation
- The (B) conjecture for the Gaussian measure of dilates of symmetric convex sets and related problems
- The fractional Fisher information and the central limit theorem for stable laws
- A free analogue of Shannon's problem on monotonicity of entropy
- Rate of convergence and Edgeworth-type expansion in the entropic central limit theorem
- Natural selection as coarsening
- Bounds on the Poincaré constant for convolution measures
- Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures
- Gaussian optimizers for entropic inequalities in quantum information
- Information in Probability: Another Information-Theoretic Proof of a Finite de Finetti Theorem
- Reverse Brunn-Minkowski and reverse entropy power inequalities for convex measures
- Log-Hessian and deviation bounds for Markov semi-groups, and regularization effect in \(\mathbb{L}^1 \)
- Convex geometry and its connections to harmonic analysis, functional analysis and probability theory
- Entropy and the discrete central limit theorem
- Volumes of subset Minkowski sums and the Lyusternik region
- Dynamical Gibbs variational principles for irreversible interacting particle systems with applications to attractor properties
- Stein's density method for multivariate continuous distributions
- Two Remarks on Generalized Entropy Power Inequalities
- Entropy and monotonicity in artificial intelligence
- Shannon's monotonicity problem for free and classical entropy
- Higher-order Stein kernels for Gaussian approximation
- Rényi divergence and the central limit theorem
- Log-concavity and strong log-concavity: a review
- Entropy inequalities for stable densities and strengthened central limit theorems
- Fisher information and the central limit theorem
- The convexification effect of Minkowski summation
- A local proof of the dimensional Prékopa's theorem
- Entropy and the fourth moment phenomenon
- Contribution to the theory of Pitman estimators
- On Berndtsson's generalization of Prékopa's theorem
- Sumset and Inverse Sumset Theory for Shannon Entropy
- The convergence of the Rényi entropy of the normalized sums of IID random variables
- Bounds on coarsening rates for the Lifschitz-Slyozov-Wagner equation
- Local limit theorems in free probability theory
- From Boltzmann to random matrices and beyond
- Complete monotonicity of the entropy in the central limit theorem for gamma and inverse Gaussian distributions
- Gaussian mixtures: entropy and geometric inequalities
- \(K\)-averaging agent-based model: propagation of chaos and convergence to equilibrium
- Convergence of Markov chains in information divergence
- A geometric property of the sample mean and residuals
- Maximal correlation and monotonicity of free entropy and of Stein discrepancy
- A reverse entropy power inequality for log-concave random vectors
- Majorization and Rényi entropy inequalities via Sperner theory
- Existence of Stein kernels under a spectral gap, and discrepancy bounds
- Convergence and asymptotic approximations to universal distributions in probability
- An information-theoretic proof of a finite de Finetti theorem
- Generating monotone quantities for the heat equation
- Sometimes size does not matter
- Fractional free convolution powers
Cites Work
- Entropy production by block variable summation and central limit theorems
- Entropy and the central limit theorem
- Proof of an entropy conjecture of Wehrl
- Entropy jumps in the presence of a spectral gap
- Some inequalities satisfied by the quantities of information of Fisher and Shannon