EVALUATION OF MUTUAL INFORMATION ESTIMATORS FOR TIME SERIES
Publication: 5306410
DOI: 10.1142/S0218127409025298 · zbMath: 1183.37134 · arXiv: 0904.4753 · OpenAlex: W3105208235 · MaRDI QID: Q5306410
Angeliki Papana, Dimitris Kugiumtzis
Publication date: 9 April 2010
Published in: International Journal of Bifurcation and Chaos
Full work available at URL: https://arxiv.org/abs/0904.4753
Time series analysis of dynamical systems (37M10)
Computational methods for ergodic theory (approximation of invariant measures, computation of Lyapunov exponents, entropy, etc.) (37M25)
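As context for the paper's topic, the snippet below is an illustrative sketch of the simplest class of estimator it evaluates: a plug-in (histogram-binning) mutual information estimate between two time series. The function name and bin count are hypothetical choices for demonstration, not the authors' implementation.

```python
import numpy as np

def mutual_information_hist(x, y, bins=16):
    """Naive plug-in mutual information estimate (in nats) from a 2-D histogram.

    Illustrative sketch only: the paper evaluates several estimator
    families; this shows just the basic binned (plug-in) variant.
    """
    # Joint probability mass from a 2-D histogram of the paired samples
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1)  # marginal distribution of x
    py = pxy.sum(axis=0)  # marginal distribution of y
    # I(X;Y) = sum over nonzero cells of pxy * log(pxy / (px * py))
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz])))

# Usage: a series paired with itself should score much higher than
# the same series paired with independent noise.
rng = np.random.default_rng(0)
x = rng.standard_normal(2000)
mi_self = mutual_information_hist(x, x)
mi_indep = mutual_information_hist(x, rng.standard_normal(2000))
print(mi_self, mi_indep)  # mi_self is large; mi_indep is near zero
```

Note the small positive bias of this estimator on independent data (finite samples spread over many bins), which is exactly the kind of finite-sample behavior such evaluation studies quantify.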
Related Items
- DETECTION OF DIRECT CAUSAL EFFECTS AND APPLICATION TO EPILEPTIC ELECTROENCEPHALOGRAM ANALYSIS
- A Nonparametric Causality Test: Detection of Direct Causal Effects in Multivariate Systems Using Corrected Partial Transfer Entropy
Cites Work
- Measuring synchronization in coupled model systems: a comparison of different approaches
- Distribution of mutual information from complete and incomplete data
- A two-dimensional mapping with a strange attractor
- Measuring the strangeness of strange attractors
- Testing for nonlinearity using redundancies: Quantitative and qualitative aspects
- Significance testing of information theoretic functionals
- Some applications for the useful mutual information
- Measuring stochastic dependence using \(\phi\)-divergence
- Timely detection of dynamical change in scalp EEG signals
- On optimal and data-based histograms
- Independent coordinates for strange attractors from mutual information
- On the histogram as a density estimator: L2 theory
- Enlarged scaling ranges for the KS-entropy and the information dimension
- Oscillation and Chaos in Physiological Control Systems
- Estimation of the information by an adaptive partitioning of the observation space
- Estimation of Entropy and Mutual Information
- Some Methods for Strengthening the Common χ² Tests
- Distribution of mutual information
- Selection of a kernel bandwidth for measuring dependence in hydrologic time series using the mutual information criterion