Solution of Shannon’s problem on the monotonicity of entropy
From MaRDI portal
Recommendations
- A free analogue of Shannon's problem on monotonicity of entropy
- Shannon's monotonicity problem for free and classical entropy
- scientific article; zbMATH DE number 61024
- scientific article; zbMATH DE number 3997696
- A stability result concerning the Shannon entropy
- Shannon entropy: axiomatic characterization and application
- Some derivations of the Shannon entropy
- Entropy and monotonicity
- Shannon entropy reinterpreted
Cites work
- scientific article; zbMATH DE number 3894218
- scientific article; zbMATH DE number 3062467
- Entropy and the central limit theorem
- Entropy jumps in the presence of a spectral gap
- Entropy production by block variable summation and central limit theorems
- Proof of an entropy conjecture of Wehrl
- Some inequalities satisfied by the quantities of information of Fisher and Shannon
Cited in (65)
- Maximum Entropy for Sums of Symmetric and Bounded Random Variables: A Short Derivation
- Reverse Brunn-Minkowski and reverse entropy power inequalities for convex measures
- Partial monotonicity of entropy revisited
- Gaussian mixtures: entropy and geometric inequalities
- Dynamical Gibbs variational principles for irreversible interacting particle systems with applications to attractor properties
- scientific article; zbMATH DE number 3848310
- The convexification effect of Minkowski summation
- Monotonic Decrease of the Non-Gaussianness of the Sum of Independent Random Variables: A Simple Proof
- Bounds on coarsening rates for the Lifschitz-Slyozov-Wagner equation
- Contribution to the theory of Pitman estimators
- A local proof of the dimensional Prékopa's theorem
- Gaussian approximations in high dimensional estimation
- Monotonicity of the logarithmic energy for random matrices
- A free analogue of Shannon's problem on monotonicity of entropy
- Sometimes size does not matter
- The convergence of the Rényi entropy of the normalized sums of IID random variables
- From Boltzmann to random matrices and beyond
- Two Remarks on Generalized Entropy Power Inequalities
- On properties of random binary contingency tables with non-uniform margin
- Existence of Stein kernels under a spectral gap, and discrepancy bounds
- Complete monotonicity of the entropy in the central limit theorem for gamma and inverse Gaussian distributions
- A proof of the Shepp-Olkin entropy concavity conjecture
- Information in Probability: Another Information-Theoretic Proof of a Finite de Finetti Theorem
- The fractional Fisher information and the central limit theorem for stable laws
- Semicircularity, gaussianity and monotonicity of entropy
- scientific article; zbMATH DE number 3483917
- Maximal correlation and monotonicity of free entropy and of Stein discrepancy
- A geometric property of the sample mean and residuals
- A reverse entropy power inequality for log-concave random vectors
- Higher-order Stein kernels for Gaussian approximation
- Natural selection as coarsening
- \(K\)-averaging agent-based model: propagation of chaos and convergence to equilibrium
- Gaussian optimizers for entropic inequalities in quantum information
- Majorization and Rényi entropy inequalities via Sperner theory
- Shannon's monotonicity problem for free and classical entropy
- Approximate discrete entropy monotonicity for log-concave sums
- Rényi divergence and the central limit theorem
- On Berndtsson's generalization of Prékopa's theorem
- Adjoint Brascamp-Lieb inequalities
- Entropy and the fourth moment phenomenon
- scientific article; zbMATH DE number 1574609
- Log-concavity and strong log-concavity: a review
- Entropy inequalities for stable densities and strengthened central limit theorems
- Convergence and asymptotic approximations to universal distributions in probability
- Log-Hessian and deviation bounds for Markov semi-groups, and regularization effect in \(\mathbb{L}^1 \)
- Rate of convergence and Edgeworth-type expansion in the entropic central limit theorem
- Fractional free convolution powers
- Monotonicity of entropy and Fisher information: a quick proof via maximal correlation
- Convergence of Markov chains in information divergence
- An information-theoretic proof of a finite de Finetti theorem
- A stability result concerning the Shannon entropy
- Fisher information and the central limit theorem
- Maximizing the entropy of a sum of independent bounded random variables
- Bounds on the Poincaré constant for convolution measures
- Generating monotone quantities for the heat equation
- Stein's density method for multivariate continuous distributions
- Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures
- Entropy and monotonicity in artificial intelligence
- Convex geometry and its connections to harmonic analysis, functional analysis and probability theory
- Local limit theorems in free probability theory
- Entropy and the discrete central limit theorem
- Volumes of subset Minkowski sums and the Lyusternik region
- The (B) conjecture for the Gaussian measure of dilates of symmetric convex sets and related problems
- Algorithm and application of solving entropy equation
- Sumset and Inverse Sumset Theory for Shannon Entropy
This page was built for publication: Solution of Shannon’s problem on the monotonicity of entropy (MaRDI item Q4821035)