Limit theorems for nonparametric sample entropy estimators
From MaRDI portal
Publication: Q1579529
DOI: 10.1016/S0167-7152(00)00025-0
zbMath: 0982.62031
MaRDI QID: Q1579529
Publication date: 7 April 2002
Published in: Statistics & Probability Letters
Keywords: order statistics; consistency; heavy tails; Shannon entropy; entropy central limit theorem; m-spacings; Vasicek sample entropy
62G07: Density estimation
60F05: Central limit and other weak theorems
62B10: Statistical aspects of information-theoretic topics
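The keywords above refer to Vasicek's m-spacing estimator of Shannon entropy, which the publication studies. As context for the record, here is a minimal sketch of that estimator under the standard convention of clamping order statistics at the sample boundaries (the function and parameter names are illustrative, not from the publication):

```python
import math

def vasicek_entropy(sample, m):
    """Vasicek m-spacing estimator of Shannon (differential) entropy.

    Computes (1/n) * sum_i log( n/(2m) * (X_(i+m) - X_(i-m)) ),
    where X_(j) is the j-th order statistic, clamped to X_(1) for
    j < 1 and to X_(n) for j > n.
    """
    x = sorted(sample)
    n = len(x)
    if not 0 < m < n / 2:
        raise ValueError("spacing parameter m must satisfy 0 < m < n/2")
    total = 0.0
    for i in range(n):
        lo = x[max(i - m, 0)]        # clamped lower order statistic
        hi = x[min(i + m, n - 1)]    # clamped upper order statistic
        total += math.log(n * (hi - lo) / (2 * m))
    return total / n
```

The estimator is consistent as n grows with m/n going to 0, which is the regime in which the limit theorems of the title apply.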
Related Items
- ON ENTROPY BASED TESTS FOR EXPONENTIALITY
- On testing uniformity using an information-theoretic measure
- Independent Component Analysis and Immunization: An Exploratory Study
- Goodness-of-Fit Tests Based on Correcting Moments of Entropy Estimators
- On entropy goodness-of-fit test based on integrated distribution function
- An Estimator of Shannon Entropy of Beta-Generated Distributions and a Goodness-of-Fit Test
- A new estimator of Kullback–Leibler information and its application in goodness of fit tests
- General treatment of goodness-of-fit tests based on Kullback–Leibler information
- An empirical likelihood ratio based goodness-of-fit test for inverse Gaussian distributions
- A class of Rényi information estimators for multidimensional densities
- Uniform-in-bandwidth consistency for kernel-type estimators of Shannon's entropy
- New kernel-type estimator of Shannon's entropy
- New Entropy Estimator with an Application to Test of Normality
Cites Work
- A test for multivariate normality based on sample entropy and projection pursuit
- Distribution of quantiles in samples from a bivariate population
- On the Estimation of Functionals of the Probability Density and Its Derivatives
- A nonparametric estimation of the entropy for absolutely continuous distributions (Corresp.)
- On a Simple Estimate of the Reciprocal of the Density Function