A mutual information-based k-sample test for discrete distributions
From MaRDI portal
Publication:2953264
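The title describes a mutual information-based \(k\)-sample test for discrete distributions. As a rough illustration of that idea (not the paper's exact procedure), one can estimate the plug-in mutual information between the sample label and the observed category, then calibrate it by permutation. The function names, the plug-in estimator, and the permutation calibration below are all illustrative assumptions.

```python
import numpy as np

def mutual_information(labels, values):
    """Plug-in estimate of I(L; X) in nats from paired discrete samples.

    labels[i] records which of the k samples observation i came from;
    values[i] is the observed category. Under H0 (all samples share one
    distribution), L and X are independent and I(L; X) = 0.
    """
    labels = np.asarray(labels)
    values = np.asarray(values)
    mi = 0.0
    for l in np.unique(labels):
        p_l = np.mean(labels == l)
        for v in np.unique(values):
            p_lv = np.mean((labels == l) & (values == v))
            if p_lv > 0:
                p_v = np.mean(values == v)
                mi += p_lv * np.log(p_lv / (p_l * p_v))
    return mi

def mi_k_sample_test(samples, n_perm=999, seed=0):
    """Permutation p-value for H0: the k discrete samples share one distribution.

    `samples` is a list of k sequences of category labels. Shuffling the
    sample labels simulates the null; the p-value is the fraction of
    permuted statistics at least as large as the observed one.
    """
    rng = np.random.default_rng(seed)
    values = np.concatenate([np.asarray(s) for s in samples])
    labels = np.concatenate([np.full(len(s), i) for i, s in enumerate(samples)])
    observed = mutual_information(labels, values)
    count = sum(
        mutual_information(rng.permutation(labels), values) >= observed
        for _ in range(n_perm)
    )
    return observed, (count + 1) / (n_perm + 1)
```

For two perfectly separated samples, e.g. `mi_k_sample_test([[0] * 30, [1] * 30])`, the observed statistic is \(\log 2\) and the permutation p-value is small; for samples drawn from the same distribution the statistic is near zero and the p-value large.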
Recommendations
- Testing for differences among discrete distributions: an application of model-based clustering
- Nonparametric \(K\)-sample tests via dynamic slicing
- Dispersive comparison of distributions: A multisample testing problem
- \(k\)-sample test based on the common area of kernel density estimators
- A nonparametric approach to k-sample inference based on entropy
Cites work
- scientific article; zbMATH DE number 3354418
- A Nonparametric Test for the General Two-Sample Problem
- Confidence tubes for multiple quantile plots via empirical likelihood
- Elements of Information Theory
- Empirical likelihood tests for two-sample problems via nonparametric density estimation
- Estimating the treatment effect in the two-sample problem with auxiliary information
- Estimation of Entropy and Mutual Information
- Exact tests based on the Baumgartner-Weiß-Schindler statistic -- a survey
- Mathematical statistics with applications.
- Mutual information in the frequency domain
- On a Test of Whether one of Two Random Variables is Stochastically Larger than the Other
- On the Distribution of the Two-Sample Cramér-von Mises Criterion
- On the Efficiency of Two-sample Mann-Whitney Test for Discrete Populations
- Semi-empirical likelihood ratio confidence intervals for the difference of two sample means
- Shannon entropy and mutual information for multivariate skew-elliptical distributions
- Simultaneous confidence bands for ratios of survival functions via empirical likelihood.
- The Kolmogorov-Smirnov, Cramér-von Mises Tests
- Time series analysis of categorical data using auto-mutual information
- Two-sample empirical likelihood method
Cited in: 1 document