Entropy and the central limit theorem
From MaRDI portal
Cited in
(only showing first 100 items)
- Entropy jumps in the presence of a spectral gap
- On entropy production for controlled Markovian evolution
- Complete monotonicity of the entropy in the central limit theorem for gamma and inverse Gaussian distributions
- Entropy inequalities and the central limit theorem.
- Central limit theorem and convergence to stable laws in Mallows distance
- Convergence of Differential Entropies
- Superadditivity of Fisher's information and logarithmic Sobolev inequalities
- Strict entropy production bounds and stability of the rate of convergence to equilibrium for the Boltzmann equation
- Information functionals with applications to random walk and statistics
- Central limit theorems for stochastic processes under random entropy conditions
- Convergence and asymptotic approximations to universal distributions in probability
- Notion of information and independent component analysis.
- scientific article (no title available); zbMATH DE number 4034753
- Entropy production by block variable summation and central limit theorems
- scientific article (no title available); zbMATH DE number 2034513
- The CLT in high dimensions: quantitative bounds via martingale embedding
- Entropy, the central limit theorem and the algebra of the canonical commutation relation
- scientific article (no title available); zbMATH DE number 2131215
- Generalization of the Kullback-Leibler divergence in the Tsallis statistics
- Autour de l'inégalité de Brunn-Minkowski [Around the Brunn-Minkowski inequality]
- Fisher information and convergence to stable laws
- Differential entropy and dynamics of uncertainty
- Local limit theorems for multiplicative free convolutions
- Fisher information and the fourth moment theorem
- Berry-Esseen bounds in the entropic central limit theorem
- Information inequalities and a dependent Central Limit Theorem
- An integral representation of the relative entropy
- Solution of Shannon’s problem on the monotonicity of entropy
- Entropy production estimates for Boltzmann equations with physically realistic collision kernels
- Rate of convergence and Edgeworth-type expansion in the entropic central limit theorem
- Convergence to stable laws in relative entropy
- Asymptotic expansions in the CLT in free probability
- The entropic Erdős-Kac limit theorem
- Strong log-concavity is preserved by convolution
- Sometimes size does not matter
- Phase space gradient of dissipated work and information: A role of relative Fisher information
- Log-concavity and the maximum entropy property of the Poisson distribution
- Entropic approach to E. Rio's central limit theorem for \(W_2\) transport distance
- Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures
- On convergence properties of Shannon entropy
- Convergence of Markov chains in information divergence
- Geometric inequalities from phase space translations
- Fisher information estimates for Boltzmann's collision operator
- scientific article (no title available); zbMATH DE number 1465044
- The fractional Fisher information and the central limit theorem for stable laws
- Negentropy as a function of cumulants
- A note on a local limit theorem for Wiener space valued random variables
- Partial information reference priors: Derivation and interpretations
- Quantitative CLTs on a Gaussian space: a survey of recent developments
- An invariance principle under the total variation distance
- Log-concavity and strong log-concavity: a review
- Convergence in distribution norms in the CLT for non identical distributed random variables
- Probability interference in expected utility theory
- Local limit theorems in free probability theory
- On Shannon's formula and Hartley's rule: beyond the mathematical coincidence
- The analogues of entropy and of Fisher's information measure in free probability theory. I
- Rényi divergence and the central limit theorem
- Optimality and sub-optimality of PCA. I: Spiked random matrix models
- The convexification effect of Minkowski summation
- From information scaling of natural images to regimes of statistical models
- Entropy and the fourth moment phenomenon
- Variational inequalities for arbitrary multivariate distributions
- The convergence of the Rényi entropy of the normalized sums of IID random variables
- Entropy inequalities for stable densities and strengthened central limit theorems
- Prohorov-type local limit theorems on abstract Wiener spaces
- An extension of entropy power inequality for dependent random variables
- Local limit theorems for densities in Orlicz spaces
- Entropy-based test for generalised Gaussian distributions
- Asymptotic behavior of Rényi entropy in the central limit theorem
- A comment on rates of convergence for density function in extreme value theory and Rényi entropy
- An elementary proof of the inequality \(\chi \leq \chi^\ast\) for conditional free entropy
- An informatic approach to a long memory stationary process
- Relations between information and estimation in the presence of feedback
- An elementary approach to free entropy theory for convex potentials
- Maximum smoothed likelihood density estimation
- Entropy and random vectors
- A stroll along the gamma
- Non-uniform bounds and Edgeworth expansions in self-normalized limit theorems
- On the convergence of Shannon entropy of distribution functions in the max domain of attraction of max-stable laws
- scientific article (no title available); zbMATH DE number 7796494
- Larry Brown's contributions to parametric inference, decision theory and foundations: a survey
- Tropical Gaussians: a brief survey
- Centralizers and entropy
- On the equivalence of statistical distances for isotropic convex measures
- An information-theoretic proof of a finite de Finetti theorem
- A lower bound on the relative entropy with respect to a symmetric probability
- The entropy power inequality with quantum conditioning
- A historical perspective on Schützenberger-Pinsker inequalities
- On the entropy and information of Gaussian mixtures
- Information measures in perspective
- Information in Probability: Another Information-Theoretic Proof of a Finite de Finetti Theorem
- Entropy power inequalities for qudits
- Log-Hessian and deviation bounds for Markov semi-groups, and regularization effect in \(\mathbb{L}^1 \)
- Information-theoretic convergence of extreme values to the Gumbel distribution
- Limiting properties of some measures of information
- Gaussian optimizers for entropic inequalities in quantum information
- Shannon's monotonicity problem for free and classical entropy
- Entropy production per site in (nonreversible) spin-flip processes.
- Further investigations of Rényi entropy power inequalities and an entropic characterization of \(s\)-concave densities
- Approximate discrete entropy monotonicity for log-concave sums
This page was built for publication: Entropy and the central limit theorem
MaRDI item: Q1080263