Fisher information and the central limit theorem
Publication: 2249581
DOI: 10.1007/s00440-013-0500-5
zbMATH Open: 1372.60018
arXiv: 1204.6650
OpenAlex: W2123901758
MaRDI QID: Q2249581
FDO: Q2249581
Sergey G. Bobkov, Gennadiy P. Chistyakov, Friedrich Götze
Publication date: 2 July 2014
Published in: Probability Theory and Related Fields
Abstract: An Edgeworth-type expansion is established for the relative Fisher information distance to the class of normal distributions for sums of i.i.d. random variables satisfying moment conditions. The validity of the central limit theorem is studied via properties of the Fisher information along convolutions.
Full work available at URL: https://arxiv.org/abs/1204.6650
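For context, the relative Fisher information distance referred to in the abstract is the standard quantity below (a conventional definition, not quoted from this record):

```latex
% Relative Fisher information of a random variable X with a smooth
% density p, taken with respect to the standard normal density \varphi.
% Since \varphi'(x)/\varphi(x) = -x, the two forms coincide:
I(X \,\|\, Z)
  = \int_{-\infty}^{\infty}
      \Bigl( \frac{p'(x)}{p(x)} - \frac{\varphi'(x)}{\varphi(x)} \Bigr)^{2} p(x)\,dx
  = \int_{-\infty}^{\infty}
      \Bigl( \frac{p'(x)}{p(x)} + x \Bigr)^{2} p(x)\,dx .

% For i.i.d. summands X_1, X_2, \ldots with mean 0 and variance 1,
% the standardized sums are
Z_n = \frac{X_1 + \cdots + X_n}{\sqrt{n}} ,

% and the Fisher-information form of the central limit theorem
% concerns the convergence I(Z_n \| Z) \to 0 as n \to \infty.
```

The Edgeworth-type expansion in the paper refines this convergence by quantifying the rate of decay of \(I(Z_n \,\|\, Z)\) in powers of \(1/n\) under moment conditions.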
Cites Work
- Fisher information inequalities and the central limit theorem
- Solution of Shannon’s problem on the monotonicity of entropy
- Weighted Csiszár-Kullback-Pinsker inequalities and applications to transportation inequalities
- Large deviations and isoperimetry over convex probability measures with heavy tails
- Some inequalities satisfied by the quantities of information of Fisher and Shannon
- The convolution inequality for entropy powers
- On the rate of convergence in the entropic central limit theorem
- Non-uniform bounds in local limit theorems in case of fractional moments. II
- Rate of convergence and Edgeworth-type expansion in the entropic central limit theorem
- Non-uniform bounds in local limit theorems in case of fractional moments. I
- Entropy jumps in the presence of a spectral gap
- Inequalities for characteristic functions involving Fisher information
Cited In (26)
- Distance in variation and a Fisher-type information
- Central limit theorem in uniform metrics for generalized Kac equations
- Entropic isoperimetric inequalities
- Local Limit Theorems for Smoothed Bernoulli and Other Convolutions
- Statistical meaning of Carlen's superadditivity of the Fisher information
- On the time-dependent Fisher information of a density function
- An invariance principle under the total variation distance
- Non-uniform bounds and Edgeworth expansions in self-normalized limit theorems
- Fisher Information With Respect to Cumulants
- An Information-Theoretic Proof of the Central Limit Theorem with Lindeberg Conditions
- The fractional Fisher information and the central limit theorem for stable laws
- Poincaré-type inequalities for stable densities
- Fisher information inequalities and the central limit theorem
- Rates of Fisher information convergence in the central limit theorem for nonlinear statistics
- Higher-order Stein kernels for Gaussian approximation
- Rényi divergence and the central limit theorem
- Score functions, generalized relative Fisher information and applications
- Entropy inequalities for stable densities and strengthened central limit theorems
- Convergence and asymptotic approximations to universal distributions in probability
- Stein's method, logarithmic Sobolev and transport inequalities
- Regularity of linear and polynomial images of Skorohod differentiable measures
- Upper bounds for Fisher information
- On the entropy and information of Gaussian mixtures
- Approximation of Markov semigroups in total variation distance under an irregular setting: an application to the CIR process