Two measures of sample entropy
From MaRDI portal
Publication:1332898
DOI: 10.1016/0167-7152(94)90046-9
zbMath: 0805.62009
OpenAlex: W2070879425
MaRDI QID: Q1332898
Nader Ebrahimi, Kurt Pflughoeft, Ehsan S. Soofi
Publication date: 6 February 1995
Published in: Statistics & Probability Letters
Full work available at URL: https://doi.org/10.1016/0167-7152(94)90046-9
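The record itself carries no abstract or formulas; for orientation, the paper concerns spacing-based estimates of differential entropy from a sample. Below is a minimal sketch of a Vasicek-type m-spacing estimator, the family these sample-entropy measures belong to; the function name, the default window m ≈ √n, and the boundary clamping are illustrative assumptions, not the paper's exact proposals.

```python
import numpy as np

def vasicek_entropy(x, m=None):
    """Vasicek-type m-spacing estimate of differential entropy (illustrative sketch).

    Order statistics with indices below 1 or above n are clamped to the sample
    minimum/maximum, the usual convention for this family of estimators.
    """
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    if m is None:
        m = max(1, int(np.sqrt(n)))            # heuristic window, an assumption here
    idx = np.arange(n)
    upper = x[np.minimum(idx + m, n - 1)]      # X_{(i+m)}, clamped at X_{(n)}
    lower = x[np.maximum(idx - m, 0)]          # X_{(i-m)}, clamped at X_{(1)}
    spacings = np.maximum(upper - lower, 1e-12)  # guard against tied observations
    # H_hat = (1/n) * sum log( n * (X_{(i+m)} - X_{(i-m)}) / (2m) )
    return float(np.mean(np.log(n * spacings / (2.0 * m))))

# For a standard normal sample the estimate should be near 0.5*log(2*pi*e) ≈ 1.4189.
rng = np.random.default_rng(0)
print(vasicek_entropy(rng.standard_normal(1000)))
```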
Related Items (47)
- A comprehensive empirical power comparison of univariate goodness-of-fit tests for the Laplace distribution
- Nonparametric probability density functions of entropy estimators applied to testing the Rayleigh distribution
- Tests of fit for the Gumbel distribution: EDF-based tests against entropy-based tests
- Kullback-Leibler divergence for Bayesian nonparametric model checking
- Goodness-of-fit test based on correcting moments of modified entropy estimator
- Monte Carlo comparison of five exponentiality tests using different entropy estimates
- Exponentiality test based on alpha-divergence and gamma-divergence
- Maximum entropy principle and statistical inference on condensed ordered data
- Goodness of fit test using Lin-Wong divergence based on Type-I censored data
- Estimation of entropy using random sampling
- A test for independence via Bayesian nonparametric estimation of mutual information
- Test for normality based on two new estimators of entropy
- Goodness-of-fit tests for lifetime distributions based on Type II censored data
- A new estimator of Kullback–Leibler information and its application in goodness of fit tests
- Varentropy estimators with applications in testing uniformity
- A new estimator of Kullback–Leibler information based on a local linear regression and its application in goodness-of-fit tests
- Nonparametric estimation of quantile-based entropy function
- Bayes Estimate and Inference for Entropy and Information Index of Fit
- Goodness-of-fit tests based on Verma Kullback–Leibler information
- Censored Kullback-Leibler Information and Goodness-of-Fit Test with Type II Censored Data
- An efficient correction to the density-based empirical likelihood ratio goodness-of-fit test for the inverse Gaussian distribution
- New goodness of fit tests for the Cauchy distribution
- Testing Exponentiality Based on Type II Censored Data and a New cdf Estimator
- Extropy estimators with applications in testing uniformity
- A Bayesian nonparametric estimation to entropy
- On testing uniformity using an information-theoretic measure
- An estimation of Phi divergence and its application in testing normality
- Goodness-of-Fit Tests Based on Correcting Moments of Entropy Estimators
- Testing normality based on new entropy estimators
- A wide review on exponentiality tests and two competitive proposals with application on reliability
- Moments of nonparametric probability density functions of entropy estimators applied to testing the inverse Gaussian distribution
- An entropy test for the Rayleigh distribution and power comparison
- Two new estimators of entropy for testing normality
- Testing uniformity based on new entropy estimators
- Modified entropy estimators for testing normality
- Tests of fit for the Laplace distribution based on correcting moments of entropy estimators
- Entropy estimation and goodness-of-fit tests for the inverse Gaussian and Laplace distributions using paired ranked set sampling
- Non-parametric estimation of the extropy and the entropy measures based on progressive type-II censored data with testing uniformity
- New indices of sample informational energy with application to testing uniformity
- New Entropy Estimator with an Application to Test of Normality
- A new measure of entropy of continuous random variable
- On the entropy estimators
- On Entropy-Based Test of Exponentiality in Ranked Set Sampling
- Estimation of entropy and extropy based on right censored data: a Bayesian non-parametric approach
- Testing Normality Using Transformed Data
- An entropy characterization of the inverse Gaussian distribution and related goodness-of-fit test
- Silver jubilee issue
Cites Work
- Estimation of entropy and other functionals of a multivariate density
- Estimation of the entropy and information of absolutely continuous random variables
- On the Estimation of Functionals of the Probability Density and Its Derivatives
- A nonparametric estimation of the entropy for absolutely continuous distributions (Corresp.)