Entropy and the discrete central limit theorem (Q6123274)
From MaRDI portal
scientific article; zbMATH DE number 7812493
Language | Label | Description | Also known as
---|---|---|---
English | Entropy and the discrete central limit theorem | scientific article; zbMATH DE number 7812493 |
Statements
Entropy and the discrete central limit theorem (English)
4 March 2024
Let \(X_1,X_2,\ldots\) be i.i.d. random variables taking values in a lattice of span \(h\) and having finite variance \(\sigma^2\). Writing \(S_n=X_1+\cdots+X_n\) and letting \(H(S_n)\) denote its entropy, the authors show that \[ \lim_{n\to\infty}\left[H(S_n)-\log\frac{\sqrt{n}}{h}\right]=\frac{1}{2}\log(2\pi e\sigma^2)\,, \] the entropy of a Gaussian random variable with variance \(\sigma^2\). They show that this is equivalent to the relative entropy between the standardised version of \(S_n\) and a suitably discretised Gaussian measure converging to zero as \(n\to\infty\). Via Pinsker's inequality, this implies a strong version of the central limit theorem for \(S_n\): convergence of a standardised version of \(S_n\) to a standard Gaussian in the total variation norm. The proofs make use of information-theoretic tools.
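The entropic limit above can be illustrated numerically. A minimal sketch (not from the paper, and using natural logarithms throughout) for Bernoulli summands, so that the lattice span is \(h=1\) and \(S_n\) is exactly Binomial\((n,p)\):

```python
import math

def binom_entropy_nats(n, p):
    # Exact entropy (in nats) of S_n = X_1 + ... + X_n with X_i ~ Bernoulli(p):
    # S_n is Binomial(n, p), supported on the lattice {0, ..., n} of span h = 1.
    H = 0.0
    for k in range(n + 1):
        q = math.comb(n, k) * p**k * (1 - p)**(n - k)
        if q > 0:  # far-tail terms may underflow to 0; their contribution is negligible
            H -= q * math.log(q)
    return H

p = 0.3
sigma2 = p * (1 - p)                                   # variance of one summand
gauss = 0.5 * math.log(2 * math.pi * math.e * sigma2)  # Gaussian entropy (1/2)log(2*pi*e*sigma^2)
for n in (10, 100, 1000):
    # With h = 1 the theorem predicts H(S_n) - log(sqrt(n)) -> gauss as n grows.
    gap = binom_entropy_nats(n, p) - math.log(math.sqrt(n))
    print(n, round(gap, 4), "vs", round(gauss, 4))
```

The gap \(H(S_n)-\log\sqrt{n}\) approaches \(\frac12\log(2\pi e\sigma^2)\approx 0.639\) nats already for moderate \(n\), consistent with the stated limit.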
central limit theorem
entropy
Fisher information
relative entropy
Bernoulli part decomposition
lattice distribution
convolution inequality