A novel and effective method for quantifying complexity of nonlinear time series
From MaRDI portal
Publication:6143073
Cites work
- Approximate entropy as a measure of system complexity
- Bivariate extension of dynamic cumulative residual entropy
- Cumulative Residual Entropy: A New Measure of Information
- Fractional cumulative residual Kullback–Leibler information based on Tsallis entropy
- Fractional cumulative residual entropy
- General cumulative Kullback–Leibler information
- Generalized cumulative residual entropy and record values
- Generalized cumulative residual entropy for distributions with unrestricted supports
- Measuring information transfer by dispersion transfer entropy
- More on a new concept of entropy and information
- Multiscale transfer entropy: measuring information transfer on multiple time scales
- On Information and Sufficiency
- On cumulative entropies
- On cumulative residual Kullback–Leibler information
- On cumulative residual entropy of order statistics
- On the discrete cumulative residual entropy
- On the dynamic cumulative residual entropy
- Randomness and degrees of irregularity
- Some characterization results on generalized cumulative residual entropy measure
- Some new results on the cumulative residual entropy
- Testing goodness-of-fit for exponential distribution based on cumulative residual entropy