A new estimator of entropy and its application in testing normality
Publication:3012688
DOI: 10.1080/00949650903005656
zbMath: 1270.62021
OpenAlex: W2148376270
MaRDI QID: Q3012688
Publication date: 6 July 2011
Published in: Journal of Statistical Computation and Simulation
Full work available at URL: https://doi.org/10.1080/00949650903005656
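As background for this entry and its related items: entropy-based normality tests in this literature generally build on Vasicek's (1976) spacing estimator of differential entropy, comparing the estimated entropy (via the statistic K = exp(H)/s) against the maximum-entropy bound attained by the normal distribution. The sketch below illustrates that classical construction only; it is not the new estimator proposed in the indexed paper, and the function names and the window choice m are illustrative.

```python
import math

def vasicek_entropy(x, m):
    """Vasicek (1976) spacing-based estimator of differential entropy:
    H_mn = (1/n) * sum_i log( n * (X_(i+m) - X_(i-m)) / (2m) ),
    with order statistics clamped at the sample boundaries."""
    n = len(x)
    xs = sorted(x)
    total = 0.0
    for i in range(n):
        lo = xs[max(i - m, 0)]          # X_(i-m), clamped to X_(1)
        hi = xs[min(i + m, n - 1)]      # X_(i+m), clamped to X_(n)
        total += math.log(n * (hi - lo) / (2 * m))
    return total / n

def normality_statistic(x, m):
    """K = exp(H_mn) / s. Since the normal maximizes entropy for fixed
    variance (H = log(sigma * sqrt(2*pi*e))), K stays below
    sqrt(2*pi*e) ~ 4.133, and small values of K reject normality."""
    n = len(x)
    mean = sum(x) / n
    s = math.sqrt(sum((v - mean) ** 2 for v in x) / n)
    return math.exp(vasicek_entropy(x, m)) / s
```

Critical values for K are obtained by Monte Carlo simulation under the normal null, which is why window size m and sample size n both matter in the comparisons surveyed by the related items above.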
Related Items (34)
- Nonparametric probability density functions of entropy estimators applied to testing the Rayleigh distribution
- Tests of fit for the Gumbel distribution: EDF-based tests against entropy-based tests
- Kullback-Leibler divergence for Bayesian nonparametric model checking
- Test of Normality Against Generalized Exponential Power Alternatives
- Monte Carlo comparison of five exponentiality tests using different entropy estimates
- Exponentiality test based on alpha-divergence and gamma-divergence
- Goodness of fit test using Lin-Wong divergence based on Type-I censored data
- Estimation of entropy using random sampling
- Test for normality based on two new estimators of entropy
- Efficiency of ranked set sampling in tests for normality
- Goodness-of-fit tests for lifetime distributions based on Type II censored data
- A new estimator of Kullback–Leibler information and its application in goodness of fit tests
- Varentropy estimators with applications in testing uniformity
- A new estimator of Kullback–Leibler information based on a local linear regression and its application in goodness-of-fit tests
- Tests of symmetry based on the sample entropy of order statistics and power comparison
- Goodness-of-fit tests based on Verma Kullback–Leibler information
- A model-free Bayesian classifier
- Tests of goodness of fit based on Phi-divergence
- A Bayesian nonparametric estimation to entropy
- An estimation of Phi divergence and its application in testing normality
- Goodness-of-Fit Tests Based on Correcting Moments of Entropy Estimators
- A wide review on exponentiality tests and two competitive proposals with application on reliability
- Moments of nonparametric probability density functions of entropy estimators applied to testing the inverse Gaussian distribution
- General treatment of goodness-of-fit tests based on Kullback–Leibler information
- An entropy test for the Rayleigh distribution and power comparison
- Two new estimators of entropy for testing normality
- Kernel and CDF-based estimation of extropy and entropy from progressively type-II censoring with application for goodness of fit problems
- Modified entropy estimators for testing normality
- Tests of fit for the Laplace distribution based on correcting moments of entropy estimators
- New indices of sample informational energy with application to testing uniformity
- On the entropy estimators
- On Entropy-Based Test of Exponentiality in Ranked Set Sampling
- Estimation of entropy and extropy based on right censored data: a Bayesian non-parametric approach
- Testing Normality Using Transformed Data
Cites Work
- A Mathematical Theory of Communication
- Estimation of entropy and other functionals of a multivariate density
- On entropy-based goodness-of-fit tests
- On the estimation of entropy
- Entropy-Based Tests of Uniformity
- Entropy-based goodness-of-fit test for exponentiality
- Entropy estimators‐improvements and comparisons
- A new estimator of entropy
- Monte Carlo comparison of four normality tests using different entropy estimates
- Statistical estimation of the entropy of discrete random variables with a large number of outcomes