Estimation of mutual information by the fuzzy histogram
From MaRDI portal
Recommendations
- Bayesian and quasi-Bayesian estimators for mutual information from discrete data
- Mutual information equals copula entropy
- A mutual information estimator with exponentially decaying bias
- A statistic to estimate the variance of the histogram-based mutual information estimator based on dependent pairs of observations
- Distribution of mutual information from complete and incomplete data
Cites work
- A nonlinear correlation measure for multivariable data set
- Canonical dependency analysis based on squared-loss mutual information
- Entropy expressions for multivariate continuous distributions
- Estimation of the information by an adaptive partitioning of the observation space
- Expressions for Rényi and Shannon entropies for multivariate distributions
- Fuzzy histogram and density estimation
- Histogram density estimators based upon a fuzzy partition
- Interval-valued probability density estimation based on quasi-continuous histograms: proof of the conjecture
- Justification and numerical realization of the uniform method for finding point estimates of interval elicited scaling constants
- Mutual information-based selection of optimal spatial-temporal patterns for single-trial EEG-based BCIs
Cited in (1)
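The publication's topic, estimating mutual information with a fuzzy histogram, can be illustrated with a short sketch. This is not the paper's algorithm, only a plausible minimal version under stated assumptions: triangular membership functions on equally spaced bin centres (a common choice for fuzzy partitions), per-sample renormalisation of memberships, and the plug-in formula for mutual information from the resulting joint probabilities.

```python
import numpy as np

def fuzzy_histogram_2d(x, y, bins=8):
    """Joint fuzzy histogram: each sample spreads its unit mass over the
    nearby bin centres on each axis via triangular membership functions.
    (Illustrative assumption -- the paper may use a different partition.)"""
    def memberships(v, bins):
        centres = np.linspace(v.min(), v.max(), bins)  # equally spaced centres
        width = centres[1] - centres[0]
        # triangular membership of every sample in every bin, then
        # renormalise so each sample contributes total mass 1
        m = np.maximum(0.0, 1.0 - np.abs(v[:, None] - centres[None, :]) / width)
        return m / m.sum(axis=1, keepdims=True)

    mx = memberships(np.asarray(x, dtype=float), bins)
    my = memberships(np.asarray(y, dtype=float), bins)
    joint = mx.T @ my            # bins x bins matrix of fuzzy co-occurrence mass
    return joint / joint.sum()   # normalise to a joint probability estimate

def mutual_information(pxy):
    """Plug-in mutual information (in nats) from a joint probability table."""
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0                          # skip empty cells
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

# Demo: dependent vs. independent pairs
rng = np.random.default_rng(0)
x = rng.normal(size=2000)
y_dep = x + 0.1 * rng.normal(size=2000)   # strongly dependent on x
y_ind = rng.normal(size=2000)             # independent of x
mi_dep = mutual_information(fuzzy_histogram_2d(x, y_dep))
mi_ind = mutual_information(fuzzy_histogram_2d(x, y_ind))
```

Because each sample's mass is shared smoothly between neighbouring bins, the fuzzy histogram is less sensitive to bin-edge placement than a crisp histogram, which is the usual motivation for fuzzy-partition density estimates; `mi_dep` comes out clearly larger than `mi_ind`.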
MaRDI item: Q1794453