Solution of Shannon’s problem on the monotonicity of entropy
DOI: 10.1090/S0894-0347-04-00459-X
zbMATH Open: 1062.94006
OpenAlex: W1544500240
Wikidata: Q56070029 (Scholia: Q56070029)
MaRDI QID: Q4821035
Authors: Assaf Naor, S. Artstein-Avidan, Keith Ball, Franck Barthe
Publication date: 7 October 2004
Published in: Journal of the American Mathematical Society
Full work available at URL: https://doi.org/10.1090/s0894-0347-04-00459-x
Recommendations
- A free analogue of Shannon's problem on monotonicity of entropy
- Shannon's monotonicity problem for free and classical entropy
- A stability result concerning the Shannon entropy
- Shannon entropy: axiomatic characterization and application
- Some derivations of the Shannon entropy
- Entropy and monotonicity
- Shannon entropy reinterpreted
MSC Classification
- Statistical aspects of information-theoretic topics (62B10)
- Measures of information, entropy (94A17)
Cites Work
- Proof of an entropy conjecture of Wehrl
- Title not available
- Title not available
- Entropy and the central limit theorem
- Some inequalities satisfied by the quantities of information of Fisher and Shannon
- Entropy production by block variable summation and central limit theorems
- Entropy jumps in the presence of a spectral gap
Cited In (65)
- Dynamical Gibbs variational principles for irreversible interacting particle systems with applications to attractor properties
- Monotonicity of the logarithmic energy for random matrices
- On properties of random binary contingency tables with non-uniform margin
- Information in Probability: Another Information-Theoretic Proof of a Finite de Finetti Theorem
- Approximate discrete entropy monotonicity for log-concave sums
- Adjoint Brascamp-Lieb inequalities
- Convex geometry and its connections to harmonic analysis, functional analysis and probability theory
- Entropy and the discrete central limit theorem
- Volumes of subset Minkowski sums and the Lyusternik region
- Maximum Entropy for Sums of Symmetric and Bounded Random Variables: A Short Derivation
- Partial monotonicity of entropy revisited
- Reverse Brunn-Minkowski and reverse entropy power inequalities for convex measures
- Gaussian mixtures: entropy and geometric inequalities
- Title not available
- The convexification effect of Minkowski summation
- Monotonic Decrease of the Non-Gaussianness of the Sum of Independent Random Variables: A Simple Proof
- Bounds on coarsening rates for the Lifschitz-Slyozov-Wagner equation
- Contribution to the theory of Pitman estimators
- A local proof of the dimensional Prékopa's theorem
- Gaussian approximations in high dimensional estimation
- From Boltzmann to random matrices and beyond
- Sometimes size does not matter
- A free analogue of Shannon's problem on monotonicity of entropy
- Two Remarks on Generalized Entropy Power Inequalities
- The convergence of the Rényi entropy of the normalized sums of IID random variables
- Existence of Stein kernels under a spectral gap, and discrepancy bounds
- Complete monotonicity of the entropy in the central limit theorem for gamma and inverse Gaussian distributions
- A proof of the Shepp-Olkin entropy concavity conjecture
- Semicircularity, gaussianity and monotonicity of entropy
- The fractional Fisher information and the central limit theorem for stable laws
- Title not available
- Maximal correlation and monotonicity of free entropy and of Stein discrepancy
- Higher-order Stein kernels for Gaussian approximation
- A reverse entropy power inequality for log-concave random vectors
- A geometric property of the sample mean and residuals
- Gaussian optimizers for entropic inequalities in quantum information
- Shannon's monotonicity problem for free and classical entropy
- Natural selection as coarsening
- \(K\)-averaging agent-based model: propagation of chaos and convergence to equilibrium
- Majorization and Rényi entropy inequalities via Sperner theory
- Rényi divergence and the central limit theorem
- On Berndtsson's generalization of Prékopa's theorem
- Entropy and the fourth moment phenomenon
- Title not available
- Log-concavity and strong log-concavity: a review
- Entropy inequalities for stable densities and strengthened central limit theorems
- Convergence and asymptotic approximations to universal distributions in probability
- Fractional free convolution powers
- Log-Hessian and deviation bounds for Markov semi-groups, and regularization effect in \(\mathbb{L}^1 \)
- Rate of convergence and Edgeworth-type expansion in the entropic central limit theorem
- Monotonicity of entropy and Fisher information: a quick proof via maximal correlation
- A stability result concerning the Shannon entropy
- An information-theoretic proof of a finite de Finetti theorem
- Convergence of Markov chains in information divergence
- Maximizing the entropy of a sum of independent bounded random variables
- Fisher information and the central limit theorem
- Bounds on the Poincaré constant for convolution measures
- Stein's density method for multivariate continuous distributions
- Generating monotone quantities for the heat equation
- Algorithm and application of solving entropy equation
- Entropy and monotonicity in artificial intelligence
- Sumset and Inverse Sumset Theory for Shannon Entropy
- Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures
- Local limit theorems in free probability theory
- The (B) conjecture for the Gaussian measure of dilates of symmetric convex sets and related problems