Publication: 3158591

From MaRDI portal

zbMath: 1061.60019
MaRDI QID: Q3158591

Oliver Johnson

Publication date: 28 January 2005

Full work available at URL: http://ebooks.worldscinet.com/ISBN/9781860945373/toc.shtml


60F05: Central limit and other weak theorems

94A15: Information theory (general)

62B10: Statistical aspects of information-theoretic topics

94-01: Introductory exposition (textbooks, tutorial papers, etc.) pertaining to information and communication theory

60-02: Research exposition (monographs, survey articles) pertaining to probability theory

60Fxx: Limit theorems in probability theory


Related Items

On the rate of convergence in the central limit theorem for hierarchical Laplacians

A Trajectorial Approach to the Gradient Flow Properties of Langevin--Smoluchowski Diffusions

Further Investigations of Rényi Entropy Power Inequalities and an Entropic Characterization of s-Concave Densities

Two Remarks on Generalized Entropy Power Inequalities

On the Shannon entropy of the number of vertices with zero in-degree in randomly oriented hypergraphs

A de Bruijn's identity for dependent random variables based on copula theory

From Boltzmann to random matrices and beyond

A comment on rates of convergence for density function in extreme value theory and Rényi entropy

Information in Probability: Another Information-Theoretic Proof of a Finite de Finetti Theorem

Stein's density method for multivariate continuous distributions

Quasi-log concavity conjecture and its applications in statistics

Theory of \(\phi\)-Jensen variance and its applications in higher education

Fisher information and the fourth moment theorem

Rate of convergence and Edgeworth-type expansion in the entropic central limit theorem

Convergence to stable laws in relative entropy

Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures

Fisher information in different types of perfect and imperfect ranked set samples from finite mixture models

Fisher information and convergence to stable laws

Berry-Esseen bounds in the entropic central limit theorem

An integral representation of the relative entropy

On a connection between information and group lattices

Generalized Cramér-Rao relations for non-relativistic quantum systems

The generalized von Mises distribution

Entropy for semi-Markov processes with Borel state spaces: asymptotic equirepartition properties and invariance principles

The entropic Erdős-Kac limit theorem

Direct approach to quantum extensions of Fisher information

Local limit theorems in free probability theory

Convergence of Markov chains in information divergence

On the time-dependent Fisher information of a density function

Rényi divergence and the central limit theorem

Multifractal diffusion entropy analysis: optimal bin width of probability histograms

Upper bounds for Fisher information

Larry Brown's contributions to parametric inference, decision theory and foundations: a survey

A novel method to generating two-sided class of probability distributions

Fisher information and the central limit theorem

Entropy and the fourth moment phenomenon

Bounds on the maximum of the density for sums of independent random variables

On simulating truncated stable random variables

Local limit theorems for densities in Orlicz spaces

Nonuniform bounds in the Poisson approximation with applications to informational distances. II

Majorization and Rényi entropy inequalities via Sperner theory

Existence of Stein kernels under a spectral gap, and discrepancy bounds

Information functionals with applications to random walk and statistics

Stein's method, logarithmic Sobolev and transport inequalities

Convergence of densities of some functionals of Gaussian processes

Asymptotic approximation of nonparametric regression experiments with unknown variances

Notes on superadditivity of Wigner-Yanase-Dyson information

On kurtoses of two symmetric or asymmetric populations

Poisson approximation in \(\chi^2\) distance by the Stein-Chen approach

Stability of Cramer's Characterization of Normal Laws in Information Distances

On Fuzzy Theory for Econometrics

Extension of de Bruijn's identity to dependent non-Gaussian noise channels

Quantitative CLTs on a Gaussian space: a survey of recent developments

On the maximum entropy principle and the minimization of the Fisher information in Tsallis statistics