MONTE CARLO COMPARISON OF FOUR NORMALITY TESTS USING DIFFERENT ENTROPY ESTIMATES
From MaRDI portal
Publication: Q4787611
DOI: 10.1081/SAC-100107780
zbMATH Open: 1008.62505
OpenAlex: W1972872178
MaRDI QID: Q4787611
FDO: Q4787611
Domingo Morales, María Eugenia Castellanos, María Dolores Esteban, Igor Vajda
Publication date: 8 January 2003
Published in: Communications in Statistics: Simulation and Computation
Full work available at URL: https://doi.org/10.1081/sac-100107780
Recommendations
- Monte Carlo comparison of five exponentiality tests using different entropy estimates
- Entropy-based tests of uniformity: A Monte Carlo power comparison
- A test for normality based on the empirical distribution function
- Modified entropy estimators for testing normality
- Monte Carlo comparison of seven normality tests
Cited In (34)
- Goodness of fit test using Lin-Wong divergence based on Type-I censored data
- Title not available
- On entropy goodness-of-fit test based on integrated distribution function
- Test for normality based on two new estimators of entropy
- Monte Carlo comparison of five exponentiality tests using different entropy estimates
- Modified Lilliefors goodness-of-fit test for normality
- Title not available
- Efficiency of ranked set sampling in tests for normality
- New kernel-type estimator of Shannon's entropy
- Entropy-based goodness-of-fit tests—a unifying framework: Application to DNA replication
- Nonparametric estimation of information-based measures of statistical dispersion
- Monte Carlo comparison of seven normality tests
- New Entropy Estimator with an Application to Test of Normality
- Testing normality based on new entropy estimators
- Simple and Exact Empirical Likelihood Ratio Tests for Normality Based on Moment Relations
- Uniform-in-bandwidth consistency for kernel-type estimators of Shannon's entropy
- Testing Normality Using Transformed Data
- A new estimator of Kullback–Leibler information and its application in goodness of fit tests
- A tool for systematically comparing the power of tests for normality
- On the entropy estimators
- An empirical likelihood ratio-based omnibus test for normality with an adjustment for symmetric alternatives
- Two new estimators of entropy for testing normality
- Goodness-of-fit tests based on Verma Kullback–Leibler information
- A new estimator of entropy and its application in testing normality
- Chebyshev’s Inequality for Nonparametric Testing with Small N and α in Microarray Research
- A comparison of various tests of normality
- Tests of fit for the Gumbel distribution: EDF-based tests against entropy-based tests
- Goodness-of-Fit Tests Based on Correcting Moments of Entropy Estimators
- Tests of goodness of fit based on Phi-divergence
- Entropy-based tests of uniformity: A Monte Carlo power comparison
- An estimation of Phi divergence and its application in testing normality
- Goodness-of-fit test based on correcting moments of modified entropy estimator
- Modified entropy estimators for testing normality
- A note on the strong consistency of nonparametric estimation of Shannon entropy in length-biased sampling