A mutual information-based k-sample test for discrete distributions
DOI: 10.1080/02664763.2014.899325 · zbMATH Open: 1352.62063 · OpenAlex: W2081204532 · MaRDI QID: Q2953264 · FDO: Q2953264
Authors: Robert Drake, Apratim Guha
Publication date: 4 January 2017
Published in: Journal of Applied Statistics
Full work available at URL: https://doi.org/10.1080/02664763.2014.899325
Recommendations
- Testing for differences among discrete distributions: an application of model-based clustering
- Nonparametric \(K\)-sample tests via dynamic slicing
- Dispersive comparison of distributions: A multisample testing problem
- \(k\)-sample test based on the common area of kernel density estimators
- A nonparametric approach to k-sample inference based on entropy
Keywords: entropy; mutual information; discrete distributions; two-sample test; nonparametric tests; \(k\)-sample test; sports statistics
MSC: Statistical aspects of information-theoretic topics (62B10); Nonparametric hypothesis testing (62G10); Sampling theory, sample surveys (62D05); Applications of statistics (62P99)
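The record carries no abstract, but the keywords indicate the core construction: a nonparametric \(k\)-sample test for discrete data built on mutual information. As a rough illustration (not the authors' exact statistic, which this page does not reproduce), one standard variant measures the plug-in mutual information between the group label and the observed category; \(2N \cdot \widehat{MI}\) coincides with the likelihood-ratio \(G\)-statistic and is asymptotically chi-square with \((k-1)(c-1)\) degrees of freedom under the null of identical distributions. A minimal sketch, with the function name `mi_k_sample_test` chosen here for illustration:

```python
import numpy as np
from scipy.stats import chi2

def mi_k_sample_test(samples):
    """Generic MI-based k-sample test for discrete data.

    samples : list of 1-D integer arrays, one per group.
    Returns (statistic, p-value), where the statistic is
    2 * N * MI(group label; category), i.e. the G-statistic,
    compared against a chi-square with (k-1)(c-1) d.f.
    """
    values = np.concatenate(samples)
    cats = np.unique(values)
    k, c = len(samples), len(cats)
    # Joint counts: rows = groups, columns = categories.
    counts = np.zeros((k, c))
    for i, s in enumerate(samples):
        for j, v in enumerate(cats):
            counts[i, j] = np.sum(s == v)
    n = counts.sum()
    p = counts / n
    pr = p.sum(axis=1, keepdims=True)  # group marginals
    pc = p.sum(axis=0, keepdims=True)  # category marginals
    # Plug-in mutual information; zero cells contribute zero.
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = np.where(p > 0, p * np.log(p / (pr * pc)), 0.0)
    mi = terms.sum()
    stat = 2 * n * mi
    df = (k - 1) * (c - 1)
    return stat, chi2.sf(stat, df)
```

With two identical samples the estimated mutual information is exactly zero, so the statistic is 0 and the p-value is 1; disjoint supports drive the statistic toward its maximum.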
Cites Work
- Elements of Information Theory
- On a Test of Whether one of Two Random Variables is Stochastically Larger than the Other
- A Nonparametric Test for the General Two-Sample Problem
- The Kolmogorov-Smirnov, Cramer-von Mises Tests
- Confidence tubes for multiple quantile plots via empirical likelihood
- Two-sample empirical likelihood method
- Estimation of Entropy and Mutual Information
- Simultaneous confidence bands for ratios of survival functions via empirical likelihood.
- Semi-empirical likelihood ratio confidence intervals for the difference of two sample means
- Shannon entropy and mutual information for multivariate skew-elliptical distributions
- On the Distribution of the Two-Sample Cramer-von Mises Criterion
- Mutual information in the frequency domain
- Mathematical statistics with applications.
- Empirical likelihood tests for two-sample problems via nonparametric density estimation
- Time series analysis of categorical data using auto-mutual information
- Exact tests based on the Baumgartner-Weiß-Schindler statistic -- a survey
- On the Efficiency of Two-sample Mann-Whitney Test for Discrete Populations
- Estimating the treatment effect in the two-sample problem with auxiliary information
Cited In (1)