scientific article; zbMATH DE number 2131215
From MaRDI portal
Publication:3158591
Mathematics Subject Classification:
- Statistical aspects of information-theoretic topics (62B10)
- Central limit and other weak theorems (60F05)
- Information theory (general) (94A15)
- Research exposition (monographs, survey articles) pertaining to probability theory (60-02)
- Introductory exposition (textbooks, tutorial papers, etc.) pertaining to information and communication theory (94-01)
- Limit theorems in probability theory (60Fxx)
Cited in (70) documents:
- Berry-Esseen bounds in the entropic central limit theorem
- Quasi-log concavity conjecture and its applications in statistics
- Theory of \(\phi\)-Jensen variance and its applications in higher education
- Central limit theorem and deformed exponentials
- Entropy and random vectors
- Notes on superadditivity of Wigner-Yanase-Dyson information
- Information functionals with applications to random walk and statistics
- Stability of Cramér's Characterization of Normal Laws in Information Distances
- Fisher information and the fourth moment theorem
- Poisson approximation in \(\chi^2\) distance by the Stein-Chen approach
- On the time-dependent Fisher information of a density function
- scientific article; zbMATH DE number 2061735 (no title available)
- A novel method to generating two-sided class of probability distributions
- A comment on rates of convergence for density function in extreme value theory and Rényi entropy
- From Boltzmann to random matrices and beyond
- Lectures on Entropy. I: Information-Theoretic Notions
- Non-uniform bounds and Edgeworth expansions in self-normalized limit theorems
- Information theory in mathematical statistics
- Absolutely continuous self-similar measures with exponential separation
- Two Remarks on Generalized Entropy Power Inequalities
- Existence of Stein kernels under a spectral gap, and discrepancy bounds
- scientific article; zbMATH DE number 1149166 (no title available)
- On fuzzy theory for econometrics
- On kurtoses of two symmetric or asymmetric populations
- An integral representation of the relative entropy
- On the rate of convergence in the central limit theorem for hierarchical Laplacians
- Information in Probability: Another Information-Theoretic Proof of a Finite de Finetti Theorem
- The fractional Fisher information and the central limit theorem for stable laws
- Generalized Cramér-Rao relations for non-relativistic quantum systems
- Fisher information and convergence to stable laws
- On simulating truncated stable random variables
- On a connection between information and group lattices
- scientific article; zbMATH DE number 2026086 (no title available)
- Quantitative CLTs on a Gaussian space: a survey of recent developments
- Rates of Fisher information convergence in the central limit theorem for nonlinear statistics
- Asymptotic approximation of nonparametric regression experiments with unknown variances
- Majorization and Rényi entropy inequalities via Sperner theory
- Information-theoretic convergence of extreme values to the Gumbel distribution
- Fisher information bounds and applications to SDEs with small noise
- Rényi divergence and the central limit theorem
- Entropy and the fourth moment phenomenon
- Further investigations of Rényi entropy power inequalities and an entropic characterization of \(s\)-concave densities
- scientific article; zbMATH DE number 48436 (no title available)
- Multifractal diffusion entropy analysis: optimal bin width of probability histograms
- The generalized von Mises distribution
- Rate of convergence and Edgeworth-type expansion in the entropic central limit theorem
- Stein's method, logarithmic Sobolev and transport inequalities
- A Trajectorial Approach to the Gradient Flow Properties of Langevin–Smoluchowski Diffusions
- Entropy inequalities and the central limit theorem.
- Entropy for semi-Markov processes with Borel state spaces: asymptotic equirepartition properties and invariance principles
- Extension of de Bruijn's identity to dependent non-Gaussian noise channels
- The entropic Erdős-Kac limit theorem
- Convergence of densities of some functionals of Gaussian processes
- Nonuniform bounds in the Poisson approximation with applications to informational distances. II
- Bounds on the maximum of the density for sums of independent random variables
- Convergence of Markov chains in information divergence
- Upper bounds for Fisher information
- Direct approach to quantum extensions of Fisher information
- Convergence to stable laws in relative entropy
- Fisher information and the central limit theorem
- Larry Brown's contributions to parametric inference, decision theory and foundations: a survey
- On the Shannon entropy of the number of vertices with zero in-degree in randomly oriented hypergraphs
- Local limit theorems for densities in Orlicz spaces
- Jensen-variance distance measure: a unified framework for statistical and information measures
- On the maximum entropy principle and the minimization of the Fisher information in Tsallis statistics
- Stein's density method for multivariate continuous distributions
- Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures
- Local limit theorems in free probability theory
- Entropy and the discrete central limit theorem
- A de Bruijn's identity for dependent random variables based on copula theory