MONTE CARLO COMPARISON OF FOUR NORMALITY TESTS USING DIFFERENT ENTROPY ESTIMATES
From MaRDI portal
Publication: 4787611
DOI: 10.1081/SAC-100107780
zbMATH Open: 1008.62505
OpenAlex: W1972872178
MaRDI QID: Q4787611
FDO: Q4787611
Authors: Igor Vajda, María Dolores Esteban, María Eugenia Castellanos, Domingo Morales
Publication date: 8 January 2003
Published in: Communications in Statistics: Simulation and Computation
Full work available at URL: https://doi.org/10.1081/sac-100107780
Recommendations
- Monte Carlo comparison of five exponentiality tests using different entropy estimates
- Entropy-based tests of uniformity: A Monte Carlo power comparison
- A test for normality based on the empirical distribution function
- Modified entropy estimators for testing normality
- Monte Carlo comparison of seven normality tests
Cited In (36)
- Goodness of fit test using Lin-Wong divergence based on Type-I censored data
- On entropy goodness-of-fit test based on integrated distribution function
- Test for normality based on two new estimators of entropy
- Goodness-of-fit tests based on Verma Kullback-Leibler information
- Monte Carlo comparison of five exponentiality tests using different entropy estimates
- Modified Lilliefors goodness-of-fit test for normality
- Efficiency of ranked set sampling in tests for normality
- New kernel-type estimator of Shannon's entropy
- Simple and exact empirical likelihood ratio tests for normality based on moment relations
- New entropy estimator with an application to test of normality
- A test for normality based on the information energy
- Goodness-of-fit tests based on correcting moments of entropy estimators
- Tests of goodness of fit based on phi-divergence
- Nonparametric estimation of information-based measures of statistical dispersion
- Normality tests for very small sample sizes
- Monte Carlo comparison of seven normality tests
- Prognosis and optimization of homogeneous Markov message handling networks.
- Testing normality based on new entropy estimators
- Entropy-based goodness-of-fit tests -- a unifying framework: application to DNA replication
- Big data and the central limit theorem: a statistical legend
- Uniform-in-bandwidth consistency for kernel-type estimators of Shannon's entropy
- Testing Normality Using Transformed Data
- A new estimator of Kullback–Leibler information and its application in goodness of fit tests
- A tool for systematically comparing the power of tests for normality
- On the entropy estimators
- An empirical likelihood ratio-based omnibus test for normality with an adjustment for symmetric alternatives
- Two new estimators of entropy for testing normality
- A new estimator of entropy and its application in testing normality
- Chebyshev’s Inequality for Nonparametric Testing with Small N and α in Microarray Research
- A comparison of various tests of normality
- Tests of fit for the Gumbel distribution: EDF-based tests against entropy-based tests
- Entropy-based tests of uniformity: A Monte Carlo power comparison
- An estimation of Phi divergence and its application in testing normality
- Goodness-of-fit test based on correcting moments of modified entropy estimator
- Modified entropy estimators for testing normality
- A note on the strong consistency of nonparametric estimation of Shannon entropy in length-biased sampling