Entropy and the central limit theorem
DOI: 10.1214/aop/1176992632
zbMath: 0599.60024
OpenAlex: W2035364928
MaRDI QID: Q1080263
Publication date: 1986
Published in: The Annals of Probability
Full work available at URL: https://doi.org/10.1214/aop/1176992632
Central limit and other weak theorems (60F05)
Measures of information, entropy (94A17)
Statistical aspects of information-theoretic topics (62B10)
Related Items (83)
Entropy inequalities and the central limit theorem
Partial information reference priors: Derivation and interpretations
On entropy production for controlled Markovian evolution
On Shannon's formula and Hartley's rule: beyond the mathematical coincidence
Fisher information and the fourth moment theorem
The entropy power inequality with quantum conditioning
Quantitative CLTs on a Gaussian space: a survey of recent developments
Prohorov-type local limit theorems on abstract Wiener spaces
The fractional Fisher information and the central limit theorem for stable laws
Entropy-based test for generalised Gaussian distributions
Entropy production estimates for Boltzmann equations with physically realistic collision kernels
Solution of Shannon's problem on the monotonicity of entropy
Rate of convergence and Edgeworth-type expansion in the entropic central limit theorem
An elementary approach to free entropy theory for convex potentials
Convergence in distribution norms in the CLT for non identically distributed random variables
Tropical Gaussians: a brief survey
Convergence to stable laws in relative entropy
Asymptotic expansions in the CLT in free probability
A stroll along the gamma
An extension of entropy power inequality for dependent random variables
Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures
On the equivalence of statistical distances for isotropic convex measures
Log-concavity and the maximum entropy property of the Poisson distribution
Gaussian optimizers for entropic inequalities in quantum information
Fisher information and convergence to stable laws
Berry-Esseen bounds in the entropic central limit theorem
A comment on rates of convergence for density function in extreme value theory and Rényi entropy
From information scaling of natural images to regimes of statistical models
Information in Probability: Another Information-Theoretic Proof of a Finite de Finetti Theorem
An informatic approach to a long memory stationary process
An integral representation of the relative entropy
Log-Hessian and deviation bounds for Markov semi-groups, and regularization effect in \(\mathbb{L}^1 \)
Entropy and the discrete central limit theorem
Larry Brown's contributions to parametric inference, decision theory and foundations: a survey
A historical perspective on Schützenberger-Pinsker inequalities
Geometric inequalities from phase space translations
The entropic Erdős-Kac limit theorem
Information-theoretic convergence of extreme values to the Gumbel distribution
Generalization of the Kullback-Leibler divergence in the Tsallis statistics
Further Investigations of Rényi Entropy Power Inequalities and an Entropic Characterization of s-Concave Densities
Two Remarks on Generalized Entropy Power Inequalities
Entropy jumps in the presence of a spectral gap
The CLT in high dimensions: quantitative bounds via martingale embedding
Local limit theorems for multiplicative free convolutions
Shannon's monotonicity problem for free and classical entropy
Autour de l'inégalité de Brunn-Minkowski
Rényi divergence and the central limit theorem
Log-concavity and strong log-concavity: a review
Entropy inequalities for stable densities and strengthened central limit theorems
Entropy, the central limit theorem and the algebra of the canonical commutation relation
Phase space gradient of dissipated work and information: A role of relative Fisher information
The convexification effect of Minkowski summation
Strict entropy production bounds and stability of the rate of convergence to equilibrium for the Boltzmann equation
Entropy and the fourth moment phenomenon
The convergence of the Rényi entropy of the normalized sums of IID random variables
Differential entropy and dynamics of uncertainty
Local limit theorems in free probability theory
Limiting properties of some measures of information
Local limit theorems for densities in Orlicz spaces
A note on a local limit theorem for Wiener space valued random variables
On convergence properties of Shannon entropy
Complete monotonicity of the entropy in the central limit theorem for gamma and inverse Gaussian distributions
Optimality and sub-optimality of PCA. I: Spiked random matrix models
Probability interference in expected utility theory
Entropy power inequalities for qudits
Convergence of Markov chains in information divergence
Variational inequalities for arbitrary multivariate distributions
Fisher information estimates for Boltzmann's collision operator
Information functionals with applications to random walk and statistics
Strong Log-concavity is Preserved by Convolution
Convergence and asymptotic approximations to universal distributions in probability
An information-theoretic proof of a finite de Finetti theorem
Notion of information and independent component analysis
Relations Between Information and Estimation in the Presence of Feedback
Maximum smoothed likelihood density estimation
Entropy production per site in (nonreversible) spin-flip processes
The analogues of entropy and of Fisher's information measure in free probability theory. I
An invariance principle under the total variation distance
Sometimes size does not matter
Superadditivity of Fisher's information and logarithmic Sobolev inequalities
Entropy production by block variable summation and central limit theorems
Central limit theorem and convergence to stable laws in Mallows distance