A mutual information-based k-sample test for discrete distributions
Publication: 2953264
DOI: 10.1080/02664763.2014.899325 · zbMath: 1352.62063 · OpenAlex: W2081204532 · MaRDI QID: Q2953264
Publication date: 4 January 2017
Published in: Journal of Applied Statistics
Full work available at URL: https://doi.org/10.1080/02664763.2014.899325
Keywords: entropy; mutual information; discrete distributions; two-sample test; nonparametric tests; \(k\)-sample test; sports statistics
MSC: Nonparametric hypothesis testing (62G10); Sampling theory, sample surveys (62D05); Applications of statistics (62P99); Statistical aspects of information-theoretic topics (62B10)
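The record does not reproduce the paper's statistic, but the title and keywords indicate a nonparametric test that quantifies, via mutual information, how strongly the observed category depends on which of the \(k\) samples it came from. Below is a minimal hedged sketch of that general idea: a plug-in mutual information statistic between group label and value, calibrated by a permutation test. The function names, the permutation calibration, and all parameters are illustrative assumptions, not the authors' published procedure.

```python
import numpy as np

def plug_in_mutual_information(labels, values):
    """Plug-in (empirical) mutual information between two discrete sequences."""
    labels = np.asarray(labels)
    values = np.asarray(values)
    mi = 0.0
    for a in np.unique(labels):
        for b in np.unique(values):
            p_ab = np.mean((labels == a) & (values == b))
            if p_ab > 0:
                p_a = np.mean(labels == a)
                p_b = np.mean(values == b)
                mi += p_ab * np.log(p_ab / (p_a * p_b))
    return mi

def mi_k_sample_test(samples, n_perm=2000, seed=0):
    """Permutation k-sample test: H0 says all groups share one discrete distribution.

    `samples` is a list of 1-D arrays of categorical observations, one per group.
    The statistic is the plug-in mutual information between group label and value;
    it is zero under H0 and positive when the group distributions differ.
    """
    rng = np.random.default_rng(seed)
    labels = np.concatenate([np.full(len(s), i) for i, s in enumerate(samples)])
    values = np.concatenate(samples)
    observed = plug_in_mutual_information(labels, values)
    exceed = 0
    for _ in range(n_perm):
        permuted = rng.permutation(values)  # relabelling is exchangeable under H0
        if plug_in_mutual_information(labels, permuted) >= observed:
            exceed += 1
    return observed, (exceed + 1) / (n_perm + 1)

# Example: three discrete samples, the third drawn from a different distribution.
rng = np.random.default_rng(1)
x = rng.choice(4, size=200, p=[0.25, 0.25, 0.25, 0.25])
y = rng.choice(4, size=200, p=[0.25, 0.25, 0.25, 0.25])
z = rng.choice(4, size=200, p=[0.10, 0.20, 0.30, 0.40])
stat, p_value = mi_k_sample_test([x, y, z])
print(f"MI statistic = {stat:.4f}, permutation p-value = {p_value:.4f}")
```

Under these assumptions the statistic reduces to a likelihood-ratio-type divergence between the pooled and group-wise empirical distributions, which is why a permutation null is a natural calibration for small discrete samples.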
Related Items (1)
Cites Work
- Mutual information in the frequency domain
- Semi-empirical likelihood ratio confidence intervals for the difference of two sample means
- Confidence tubes for multiple quantile plots via empirical likelihood
- Exact tests based on the Baumgartner-Weiß-Schindler statistic -- a survey
- Simultaneous confidence bands for ratios of survival functions via empirical likelihood.
- Two-sample empirical likelihood method
- Time series analysis of categorical data using auto-mutual information
- The Kolmogorov-Smirnov, Cramer-von Mises Tests
- Estimating the treatment effect in the two-sample problem with auxiliary information
- A Nonparametric Test for the General Two-Sample Problem
- Estimation of Entropy and Mutual Information
- Shannon Entropy and Mutual Information for Multivariate Skew‐Elliptical Distributions
- Empirical likelihood tests for two-sample problems via nonparametric density estimation
- Elements of Information Theory
- On the Efficiency of Two-sample Mann-Whitney Test for Discrete Populations
- On the Distribution of the Two-Sample Cramer-von Mises Criterion
- On a Test of Whether one of Two Random Variables is Stochastically Larger than the Other