A new estimator of entropy and its application in testing normality

From MaRDI portal

Publication:3012688


DOI: 10.1080/00949650903005656
zbMath: 1270.62021
MaRDI QID: Q3012688

Hadi Alizadeh Noughabi

Publication date: 6 July 2011

Published in: Journal of Statistical Computation and Simulation

Full work available at URL: https://doi.org/10.1080/00949650903005656


62G10: Nonparametric hypothesis testing

62B10: Statistical aspects of information-theoretic topics
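As background for the entropy-based normality testing this entry classifies, the following is a minimal sketch of the classical Vasicek (1976) spacing-based entropy estimator, which tests of this kind commonly build on. This is illustrative only and is not the new estimator proposed in the paper itself; the function name, the square-root window heuristic, and the edge-clamping convention are assumptions of this sketch.

```python
import math

def vasicek_entropy(sample, m=None):
    """Classical Vasicek spacing-based entropy estimate (background sketch).

    Not the paper's new estimator -- an illustrative baseline only.
    Estimates H by averaging log of scaled m-spacings of order statistics,
    clamping indices at the sample boundaries.
    """
    x = sorted(sample)
    n = len(x)
    if m is None:
        m = max(1, round(math.sqrt(n)))  # assumed window heuristic, m < n/2
    total = 0.0
    for i in range(n):
        hi = x[min(i + m, n - 1)]  # order statistic clamped at upper edge
        lo = x[max(i - m, 0)]      # order statistic clamped at lower edge
        total += math.log(n * (hi - lo) / (2 * m))
    return total / n
```

In entropy-based normality tests, an estimate of this kind is compared against the maximal entropy attained by the normal distribution with the sample's variance; the null is rejected when the sample entropy is too small.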


Related Items

Goodness-of-fit tests based on Verma Kullback–Leibler information
Goodness-of-Fit Tests Based on Correcting Moments of Entropy Estimators
A wide review on exponentiality tests and two competitive proposals with application on reliability
Moments of nonparametric probability density functions of entropy estimators applied to testing the inverse Gaussian distribution
Nonparametric probability density functions of entropy estimators applied to testing the Rayleigh distribution
Tests of fit for the Gumbel distribution: EDF-based tests against entropy-based tests
Exponentiality test based on alpha-divergence and gamma-divergence
Goodness of fit test using Lin-Wong divergence based on Type-I censored data
Test for normality based on two new estimators of entropy
Efficiency of ranked set sampling in tests for normality
Goodness-of-fit tests for lifetime distributions based on Type II censored data
A new estimator of Kullback–Leibler information and its application in goodness of fit tests
Tests of goodness of fit based on Phi-divergence
An estimation of Phi divergence and its application in testing normality
General treatment of goodness-of-fit tests based on Kullback–Leibler information
An entropy test for the Rayleigh distribution and power comparison
Modified entropy estimators for testing normality
Tests of fit for the Laplace distribution based on correcting moments of entropy estimators
On Entropy-Based Test of Exponentiality in Ranked Set Sampling
Test of Normality Against Generalized Exponential Power Alternatives
Monte Carlo comparison of five exponentiality tests using different entropy estimates
Varentropy estimators with applications in testing uniformity
A new estimator of Kullback–Leibler information based on a local linear regression and its application in goodness-of-fit tests
Estimation of entropy using random sampling
Tests of symmetry based on the sample entropy of order statistics and power comparison
Kernel and CDF-based estimation of extropy and entropy from progressively type-II censoring with application for goodness of fit problems
Estimation of entropy and extropy based on right censored data: a Bayesian non-parametric approach
Kullback-Leibler divergence for Bayesian nonparametric model checking
A model-free Bayesian classifier
A Bayesian nonparametric estimation to entropy
New indices of sample informational energy with application to testing uniformity
Two new estimators of entropy for testing normality
On the entropy estimators
Testing Normality Using Transformed Data



Cites Work