Estimating the errors on measured entropy and mutual information
Publication: 1962437
DOI: 10.1016/S0167-2789(98)00269-3
zbMATH Open: 0935.94013
OpenAlex: W2067211778
MaRDI QID: Q1962437
FDO: Q1962437
Publication date: 31 January 2000
Published in: Physica D
Full work available at URL: https://doi.org/10.1016/s0167-2789(98)00269-3
MSC classification
Measures of information, entropy (94A17); Symbolic dynamics (37B10); Applications of dynamical systems (37N99)
Cites Work
- Title not available
- Finite sample effects in sequence analysis
- Testing for nonlinearity using redundancies: Quantitative and qualitative aspects
- Coarse-grained entropy rates for characterization of complex time series
- Detecting nonlinearity in multivariate time series
- Independent coordinates for strange attractors from mutual information
- Significance testing of information theoretic functionals
- Measuring statistical dependences in a time series
- Extraction of delay information from chaotic time series based on information entropy
- Information and entropy in strange attractors
- Title not available
- Singular-value decomposition in attractor reconstruction: Pitfalls and precautions
Cited In (24)
- Some relations between mutual information and estimation error in Wiener space
- A nonlinear correlation measure for multivariable data set
- Scaling invariance embedded in very short time series: a factorial moment based diffusion entropy approach
- Factorized mutual information maximization
- Macroeconomic simulation comparison with a multivariate extension of the Markov information criterion
- Nonparametric detection of dependences in stochastic point processes
- Title not available
- Methods for quantifying the causal structure of bivariate time series
- A Nonparametric Causality Test: Detection of Direct Causal Effects in Multivariate Systems Using Corrected Partial Transfer Entropy
- Using time-delayed mutual information to discover and interpret temporal correlation structure in complex populations
- On some entropy methods in data analysis
- Transition matrix analysis of earthquake magnitude sequences
- Information transfer in continuous processes
- Multi-camera piecewise planar object tracking with mutual information
- Investigation on the high-order approximation of the entropy bias
- A (econophysics) note on volatility in exchange rate time series
- Estimating Entropy Rates with Bayesian Confidence Intervals
- Chaotic characteristics analysis of the sintering process system with unknown dynamic functions based on phase space reconstruction and chaotic invariables
- Analysis of symbolic sequences using the Jensen-Shannon divergence
- Tsallis conditional mutual information in investigating long range correlation in symbol sequences
- Recurrence plot statistics and the effect of embedding
- Analytical Calculation of Mutual Information between Weakly Coupled Poisson-Spiking Neurons in Models of Dynamically Gated Communication
- A statistical dynamics approach to the study of human health data: Resolving population scale diurnal variation in laboratory data
- Distribution of mutual information