Significance testing of information theoretic functionals
From MaRDI portal
Recommendations
- Evaluation of statistical relationship of random variables via mutual information
- Nonparametric independence testing via mutual information
- Test of independence of a class of bivariate distributions based on mutual information
- Mutual information functions versus correlation functions
- Some applications for the useful mutual information
Cites work
- Scientific article (zbMATH DE number 1258151; no title available)
- Scientific article (zbMATH DE number 695323; no title available)
- Scientific article (zbMATH DE number 956560; no title available)
- Coarse-grained entropy rates for characterization of complex time series
- Detecting nonlinearity in multivariate time series
- Estimating the correlation dimension of an attractor from noisy and small datasets based on re-embedding
- Extracting qualitative dynamics from experimental data
- Measuring the strangeness of strange attractors
- Singular-value decomposition in attractor reconstruction: Pitfalls and precautions
- Testing for nonlinearity using redundancies: Quantitative and qualitative aspects
Cited in (9 documents)
- Operational Interpretation of Rényi Information Measures via Composite Hypothesis Testing Against Product and Markov Distributions
- A nonlinear correlation measure for multivariable data set
- Evaluation of mutual information estimators for time series
- Surrogate time series
- Estimating the errors on measured entropy and mutual information
- Information-Theoretic Distribution Test with Application to Normality
- Information transfer in continuous processes
- Evaluation of statistical relationship of random variables via mutual information
- Mutual information and redundancy for categorical data
This page was built for publication: Significance testing of information theoretic functionals
MaRDI item: Q1373910