Pages that link to "Item:Q1080263"
From MaRDI portal
The following pages link to Entropy and the central limit theorem (Q1080263):
Displaying 50 items.
- On Shannon's formula and Hartley's rule: beyond the mathematical coincidence (Q296309)
- Fisher information and the fourth moment theorem (Q297460)
- The fractional Fisher information and the central limit theorem for stable laws (Q310486)
- Rate of convergence and Edgeworth-type expansion in the entropic central limit theorem (Q359672)
- Convergence to stable laws in relative entropy (Q376268)
- Asymptotic expansions in the CLT in free probability (Q377516)
- Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures (Q385153)
- Fisher information and convergence to stable laws (Q396022)
- Berry-Esseen bounds in the entropic central limit theorem (Q398769)
- An integral representation of the relative entropy (Q406130)
- Local limit theorems for multiplicative free convolutions (Q457655)
- Log-concavity and strong log-concavity: a review (Q485901)
- Entropy inequalities for stable densities and strengthened central limit theorems (Q505562)
- A note on a local limit theorem for Wiener space valued random variables (Q726730)
- On convergence properties of Shannon entropy (Q734295)
- Convergence and asymptotic approximations to universal distributions in probability (Q776392)
- Notion of information and independent component analysis. (Q778564)
- Superadditivity of Fisher's information and logarithmic Sobolev inequalities (Q808512)
- Entropy production by block variable summation and central limit theorems (Q810998)
- Central limit theorem and convergence to stable laws in Mallows distance (Q817972)
- Log-concavity and the maximum entropy property of the Poisson distribution (Q885265)
- The entropic Erdős-Kac limit theorem (Q904709)
- Generalization of the Kullback-Leibler divergence in the Tsallis statistics (Q905978)
- The convergence of the Rényi entropy of the normalized sums of IID random variables (Q984005)
- Local limit theorems in free probability theory (Q989184)
- Complete monotonicity of the entropy in the central limit theorem for gamma and inverse Gaussian distributions (Q1003435)
- Convergence of Markov chains in information divergence (Q1014048)
- Entropy, the central limit theorem and the algebra of the canonical commutation relation (Q1196676)
- Strict entropy production bounds and stability of the rate of convergence to equilibrium for the Boltzmann equation (Q1203219)
- Variational inequalities for arbitrary multivariate distributions (Q1275410)
- Fisher information estimates for Boltzmann's collision operator (Q1277409)
- The analogues of entropy and of Fisher's information measure in free probability theory. I (Q1308452)
- Entropy jumps in the presence of a spectral gap (Q1409333)
- Autour de l'inégalité de Brunn-Minkowski (Q1432086)
- Entropy production per site in (nonreversible) spin-flip processes. (Q1593400)
- Prohorov-type local limit theorems on abstract Wiener spaces (Q1635744)
- Convergence in distribution norms in the CLT for non identical distributed random variables (Q1663863)
- Rényi divergence and the central limit theorem (Q1731889)
- The convexification effect of Minkowski summation (Q1755914)
- Optimality and sub-optimality of PCA. I: Spiked random matrix models (Q1800806)
- Probability interference in expected utility theory (Q1800982)
- Entropy inequalities and the central limit theorem. (Q1877517)
- Partial information reference priors: Derivation and interpretations (Q1877838)
- Entropy production estimates for Boltzmann equations with physically realistic collision kernels (Q1896796)
- An information-theoretic proof of a finite de Finetti theorem (Q2078238)
- Sometimes size does not matter (Q2103363)
- Entropy-based test for generalised Gaussian distributions (Q2143021)
- Larry Brown's contributions to parametric inference, decision theory and foundations: a survey (Q2194579)
- The CLT in high dimensions: quantitative bounds via martingale embedding (Q2212600)
- Entropy and the fourth moment phenomenon (Q2253134)