Entropy and the central limit theorem

From MaRDI portal
Publication:1080263

DOI: 10.1214/aop/1176992632
zbMath: 0599.60024
OpenAlex: W2035364928
MaRDI QID: Q1080263

Andrew R. Barron

Publication date: 1986

Published in: The Annals of Probability

Full work available at URL: https://doi.org/10.1214/aop/1176992632

Related Items (83)

Entropy inequalities and the central limit theorem.
Partial information reference priors: Derivation and interpretations
On entropy production for controlled Markovian evolution
On Shannon's formula and Hartley's rule: beyond the mathematical coincidence
Fisher information and the fourth moment theorem
The entropy power inequality with quantum conditioning
Quantitative CLTs on a Gaussian space: a survey of recent developments
Prohorov-type local limit theorems on abstract Wiener spaces
The fractional Fisher information and the central limit theorem for stable laws
Entropy-based test for generalised Gaussian distributions
Entropy production estimates for Boltzmann equations with physically realistic collision kernels
Solution of Shannon's problem on the monotonicity of entropy
Rate of convergence and Edgeworth-type expansion in the entropic central limit theorem
An elementary approach to free entropy theory for convex potentials
Convergence in distribution norms in the CLT for non identical distributed random variables
Tropical Gaussians: a brief survey
Convergence to stable laws in relative entropy
Asymptotic expansions in the CLT in free probability
A stroll along the gamma
An extension of entropy power inequality for dependent random variables
Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures
On the equivalence of statistical distances for isotropic convex measures
Log-concavity and the maximum entropy property of the Poisson distribution
Gaussian optimizers for entropic inequalities in quantum information
Fisher information and convergence to stable laws
Berry-Esseen bounds in the entropic central limit theorem
A comment on rates of convergence for density function in extreme value theory and Rényi entropy
From information scaling of natural images to regimes of statistical models
Information in Probability: Another Information-Theoretic Proof of a Finite de Finetti Theorem
An informatic approach to a long memory stationary process
An integral representation of the relative entropy
Log-Hessian and deviation bounds for Markov semi-groups, and regularization effect in \(\mathbb{L}^1 \)
Entropy and the discrete central limit theorem
Larry Brown's contributions to parametric inference, decision theory and foundations: a survey
A historical perspective on Schützenberger-Pinsker inequalities
Geometric inequalities from phase space translations
The entropic Erdős-Kac limit theorem
Information-theoretic convergence of extreme values to the Gumbel distribution
Generalization of the Kullback-Leibler divergence in the Tsallis statistics
Further Investigations of Rényi Entropy Power Inequalities and an Entropic Characterization of s-Concave Densities
Two Remarks on Generalized Entropy Power Inequalities
Entropy jumps in the presence of a spectral gap
The CLT in high dimensions: quantitative bounds via martingale embedding
Local limit theorems for multiplicative free convolutions
Unnamed Item
Shannon's monotonicity problem for free and classical entropy
Autour de l'inégalité de Brunn-Minkowski
Rényi divergence and the central limit theorem
Log-concavity and strong log-concavity: a review
Entropy inequalities for stable densities and strengthened central limit theorems
Entropy, the central limit theorem and the algebra of the canonical commutation relation
Phase space gradient of dissipated work and information: A role of relative Fisher information
The convexification effect of Minkowski summation
Strict entropy production bounds and stability of the rate of convergence to equilibrium for the Boltzmann equation
Entropy and the fourth moment phenomenon
The convergence of the Rényi entropy of the normalized sums of IID random variables
Differential entropy and dynamics of uncertainty
Local limit theorems in free probability theory
Limiting properties of some measures of information
Local limit theorems for densities in Orlicz spaces
A note on a local limit theorem for Wiener space valued random variables
On convergence properties of Shannon entropy
Complete monotonicity of the entropy in the central limit theorem for gamma and inverse Gaussian distributions
Optimality and sub-optimality of PCA. I: Spiked random matrix models
Probability interference in expected utility theory
Entropy power inequalities for qudits
Convergence of Markov chains in information divergence
Variational inequalities for arbitrary multivariate distributions
Fisher information estimates for Boltzmann's collision operator
Information functionals with applications to random walk and statistics
Strong Log-concavity is Preserved by Convolution
Convergence and asymptotic approximations to universal distributions in probability
An information-theoretic proof of a finite de Finetti theorem
Notion of information and independent component analysis.
Relations Between Information and Estimation in the Presence of Feedback
Maximum smoothed likelihood density estimation
Entropy production per site in (nonreversible) spin-flip processes.
The analogues of entropy and of Fisher's information measure in free probability theory. I
An invariance principle under the total variation distance
Sometimes size does not matter
Superadditivity of Fisher's information and logarithmic Sobolev inequalities
Entropy production by block variable summation and central limit theorems
Central limit theorem and convergence to stable laws in Mallows distance

This page was built for publication: Entropy and the central limit theorem