An Information-Theoretic Proof of the Central Limit Theorem with Lindeberg Conditions

From MaRDI portal
Publication: 3276197

DOI: 10.1137/1104028
zbMath: 0097.13103
OpenAlex: W2018762986
MaRDI QID: Q3276197

Yu. V. Linnik

Publication date: 1960

Published in: Theory of Probability & Its Applications

Full work available at URL: https://doi.org/10.1137/1104028




Related Items (44)

Entropy inequalities and the central limit theorem.
Prohorov-type local limit theorems on abstract Wiener spaces
An information-theoretic proof of Nash's inequality
The fractional Fisher information and the central limit theorem for stable laws
A Minkowski theory of observation: Application to uncertainty and fuzziness
Entropy production estimates for Boltzmann equations with physically realistic collision kernels
The information theoretic proof of Kac's theorem
Rate of convergence and Edgeworth-type expansion in the entropic central limit theorem
Statistical modelling of higher-order correlations in pools of neural activity
Convergence to stable laws in relative entropy
Fisher information and convergence to stable laws
Berry-Esseen bounds in the entropic central limit theorem
A comment on rates of convergence for density function in extreme value theory and Rényi entropy
Information in Probability: Another Information-Theoretic Proof of a Finite de Finetti Theorem
Rényi entropies and nonlinear diffusion equations
On the generalization of the Boltzmann H-theorem for a spatially homogeneous Maxwell gas
Entropy and the discrete central limit theorem
Poincaré-type inequalities for stable densities
Dynamical Gibbs variational principles for irreversible interacting particle systems with applications to attractor properties
Entropy jumps in the presence of a spectral gap
Autour de l'inégalité de Brunn-Minkowski
Heat equation and convolution inequalities
Direct approach to quantum extensions of Fisher information
Rényi divergence and the central limit theorem
Log-concavity and strong log-concavity: a review
Entropy inequalities for stable densities and strengthened central limit theorems
Lyapunov functionals for a Maxwell gas
Strict entropy production bounds and stability of the rate of convergence to equilibrium for the Boltzmann equation
Entropy and the fourth moment phenomenon
New a priori estimates for the spatially homogeneous Boltzmann equation
Zero variance Markov chain Monte Carlo for Bayesian estimators
The convergence of the Rényi entropy of the normalized sums of IID random variables
Differential entropy and dynamics of uncertainty
Limiting properties of some measures of information
A note on a local limit theorem for Wiener space valued random variables
On the fractional Fisher information with applications to a hyperbolic-parabolic system of chemotaxis
From Boltzmann to random matrices and beyond
Probability interference in expected utility theory
Reaching the best possible rate of convergence to equilibrium for solutions of Kac's equation via central limit theorem
Fisher information estimates for Boltzmann's collision operator
The information-theoretic meaning of Gagliardo-Nirenberg type inequalities
Information functionals with applications to random walk and statistics
Generating monotone quantities for the heat equation
Entropy production by block variable summation and central limit theorems



