Solution of Shannon’s problem on the monotonicity of entropy


Publication: 4821035

DOI: 10.1090/S0894-0347-04-00459-X
zbMath: 1062.94006
OpenAlex: W1544500240
Wikidata: Q56070029
Scholia: Q56070029
MaRDI QID: Q4821035

Assaf Naor, Franck Barthe, Keith M. Ball, Shiri Artstein-Avidan

Publication date: 7 October 2004

Published in: Journal of the American Mathematical Society

Full work available at URL: https://doi.org/10.1090/s0894-0347-04-00459-x
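For context, the monotonicity result referred to in the title can be stated as follows; this is a standard formulation added for orientation only, with notation (\(X_i\), \(H\)) not taken from this record. If \(X_1, X_2, \ldots\) are independent, identically distributed square-integrable random variables and \(H\) denotes differential entropy, then

\[ H\!\left(\frac{X_1 + \cdots + X_{n+1}}{\sqrt{n+1}}\right) \;\ge\; H\!\left(\frac{X_1 + \cdots + X_n}{\sqrt{n}}\right) \qquad \text{for every } n \ge 1, \]

so the entropy of the normalized sums appearing in the central limit theorem is non-decreasing in \(n\).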




Related Items (50)

Gaussian approximations in high dimensional estimation
The (B) conjecture for the Gaussian measure of dilates of symmetric convex sets and related problems
The fractional Fisher information and the central limit theorem for stable laws
A free analogue of Shannon's problem on monotonicity of entropy
Rate of convergence and Edgeworth-type expansion in the entropic central limit theorem
Natural selection as coarsening
Bounds on the Poincaré constant for convolution measures
Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures
Gaussian optimizers for entropic inequalities in quantum information
Information in Probability: Another Information-Theoretic Proof of a Finite de Finetti Theorem
Reverse Brunn-Minkowski and reverse entropy power inequalities for convex measures
Log-Hessian and deviation bounds for Markov semi-groups, and regularization effect in \(\mathbb{L}^1\)
Convex geometry and its connections to harmonic analysis, functional analysis and probability theory
Entropy and the discrete central limit theorem
Volumes of subset Minkowski sums and the Lyusternik region
Dynamical Gibbs variational principles for irreversible interacting particle systems with applications to attractor properties
Stein's density method for multivariate continuous distributions
Two Remarks on Generalized Entropy Power Inequalities
Entropy and monotonicity in artificial intelligence
Unnamed Item
Shannon's monotonicity problem for free and classical entropy
Higher-order Stein kernels for Gaussian approximation
Rényi divergence and the central limit theorem
Log-concavity and strong log-concavity: a review
Entropy inequalities for stable densities and strengthened central limit theorems
Fisher information and the central limit theorem
The convexification effect of Minkowski summation
A local proof of the dimensional Prékopa's theorem
Entropy and the fourth moment phenomenon
Contribution to the theory of Pitman estimators
On Berndtsson's generalization of Prékopa's theorem
Sumset and Inverse Sumset Theory for Shannon Entropy
The convergence of the Rényi entropy of the normalized sums of IID random variables
Bounds on coarsening rates for the Lifschitz-Slyozov-Wagner equation
Local limit theorems in free probability theory
From Boltzmann to random matrices and beyond
Complete monotonicity of the entropy in the central limit theorem for gamma and inverse Gaussian distributions
Gaussian mixtures: entropy and geometric inequalities
\(K\)-averaging agent-based model: propagation of chaos and convergence to equilibrium
Convergence of Markov chains in information divergence
A geometric property of the sample mean and residuals
Maximal correlation and monotonicity of free entropy and of Stein discrepancy
A reverse entropy power inequality for log-concave random vectors
Majorization and Rényi entropy inequalities via Sperner theory
Existence of Stein kernels under a spectral gap, and discrepancy bounds
Convergence and asymptotic approximations to universal distributions in probability
An information-theoretic proof of a finite de Finetti theorem
Generating monotone quantities for the heat equation
Sometimes size does not matter
Fractional free convolution powers



Cites Work




This page was built for publication: Solution of Shannon’s problem on the monotonicity of entropy