An Information-Theoretic Proof of the Central Limit Theorem with Lindeberg Conditions
From MaRDI portal
Publication:3276197
DOI: 10.1137/1104028 · zbMath: 0097.13103 · OpenAlex: W2018762986 · MaRDI QID: Q3276197
Publication date: 1960
Published in: Theory of Probability & Its Applications
Full work available at URL: https://doi.org/10.1137/1104028
Related Items (44)
Entropy inequalities and the central limit theorem
Prohorov-type local limit theorems on abstract Wiener spaces
An information-theoretic proof of Nash's inequality
The fractional Fisher information and the central limit theorem for stable laws
A Minkowski theory of observation: Application to uncertainty and fuzziness
Entropy production estimates for Boltzmann equations with physically realistic collision kernels
The information theoretic proof of Kac's theorem
Rate of convergence and Edgeworth-type expansion in the entropic central limit theorem
Statistical modelling of higher-order correlations in pools of neural activity
Convergence to stable laws in relative entropy
Fisher information and convergence to stable laws
Berry-Esseen bounds in the entropic central limit theorem
A comment on rates of convergence for density function in extreme value theory and Rényi entropy
Information in Probability: Another Information-Theoretic Proof of a Finite de Finetti Theorem
Rényi entropies and nonlinear diffusion equations
On the generalization of the Boltzmann H-theorem for a spatially homogeneous Maxwell gas
Entropy and the discrete central limit theorem
Poincaré-type inequalities for stable densities
Dynamical Gibbs variational principles for irreversible interacting particle systems with applications to attractor properties
Entropy jumps in the presence of a spectral gap
Autour de l'inégalité de Brunn-Minkowski
Heat equation and convolution inequalities
Direct approach to quantum extensions of Fisher information
Rényi divergence and the central limit theorem
Log-concavity and strong log-concavity: a review
Entropy inequalities for stable densities and strengthened central limit theorems
Lyapunov functionals for a Maxwell gas
Strict entropy production bounds and stability of the rate of convergence to equilibrium for the Boltzmann equation
Entropy and the fourth moment phenomenon
New a priori estimates for the spatially homogeneous Boltzmann equation
Zero variance Markov chain Monte Carlo for Bayesian estimators
The convergence of the Rényi entropy of the normalized sums of IID random variables
Differential entropy and dynamics of uncertainty
Limiting properties of some measures of information
A note on a local limit theorem for Wiener space valued random variables
On the fractional Fisher information with applications to a hyperbolic-parabolic system of chemotaxis
From Boltzmann to random matrices and beyond
Probability interference in expected utility theory
Reaching the best possible rate of convergence to equilibrium for solutions of Kac's equation via central limit theorem
Fisher information estimates for Boltzmann's collision operator
The information-theoretic meaning of Gagliardo-Nirenberg type inequalities
Information functionals with applications to random walk and statistics
Generating monotone quantities for the heat equation
Entropy production by block variable summation and central limit theorems