Two measures of sample entropy

From MaRDI portal
Publication: 1332898

DOI: 10.1016/0167-7152(94)90046-9 · zbMath: 0805.62009 · OpenAlex: W2070879425 · MaRDI QID: Q1332898

Kurt Pflughoeft, Ehsan S. Soofi, Nader Ebrahimi

Publication date: 6 February 1995

Published in: Statistics & Probability Letters

Full work available at URL: https://doi.org/10.1016/0167-7152(94)90046-9




Related Items (47)

- A comprehensive empirical power comparison of univariate goodness-of-fit tests for the Laplace distribution
- Nonparametric probability density functions of entropy estimators applied to testing the Rayleigh distribution
- Tests of fit for the Gumbel distribution: EDF-based tests against entropy-based tests
- Kullback-Leibler divergence for Bayesian nonparametric model checking
- Goodness-of-fit test based on correcting moments of modified entropy estimator
- Monte Carlo comparison of five exponentiality tests using different entropy estimates
- Exponentiality test based on alpha-divergence and gamma-divergence
- Maximum entropy principle and statistical inference on condensed ordered data
- Goodness of fit test using Lin-Wong divergence based on Type-I censored data
- Estimation of entropy using random sampling
- A test for independence via Bayesian nonparametric estimation of mutual information
- Test for normality based on two new estimators of entropy
- Goodness-of-fit tests for lifetime distributions based on Type II censored data
- A new estimator of Kullback–Leibler information and its application in goodness of fit tests
- Varentropy estimators with applications in testing uniformity
- A new estimator of Kullback–Leibler information based on a local linear regression and its application in goodness-of-fit tests
- Nonparametric estimation of quantile-based entropy function
- Bayes Estimate and Inference for Entropy and Information Index of Fit
- Goodness-of-fit tests based on Verma Kullback–Leibler information
- Censored Kullback-Leibler Information and Goodness-of-Fit Test with Type II Censored Data
- An efficient correction to the density-based empirical likelihood ratio goodness-of-fit test for the inverse Gaussian distribution
- New goodness of fit tests for the Cauchy distribution
- Testing Exponentiality Based on Type II Censored Data and a New cdf Estimator
- Extropy estimators with applications in testing uniformity
- A Bayesian nonparametric estimation to entropy
- On testing uniformity using an information-theoretic measure
- An estimation of Phi divergence and its application in testing normality
- Goodness-of-Fit Tests Based on Correcting Moments of Entropy Estimators
- Testing normality based on new entropy estimators
- A wide review on exponentiality tests and two competitive proposals with application on reliability
- Moments of nonparametric probability density functions of entropy estimators applied to testing the inverse Gaussian distribution
- An entropy test for the Rayleigh distribution and power comparison
- Two new estimators of entropy for testing normality
- Testing uniformity based on new entropy estimators
- Modified entropy estimators for testing normality
- Tests of fit for the Laplace distribution based on correcting moments of entropy estimators
- Entropy estimation and goodness-of-fit tests for the inverse Gaussian and Laplace distributions using paired ranked set sampling
- Non-parametric estimation of the extropy and the entropy measures based on progressive type-II censored data with testing uniformity
- New indices of sample informational energy with application to testing uniformity
- New Entropy Estimator with an Application to Test of Normality
- A new measure of entropy of continuous random variable
- On the entropy estimators
- On Entropy-Based Test of Exponentiality in Ranked Set Sampling
- Estimation of entropy and extropy based on right censored data: a Bayesian non-parametric approach
- Testing Normality Using Transformed Data
- An entropy characterization of the inverse Gaussian distribution and related goodness-of-fit test
- Silver jubilee issue


