Shannon entropy: a rigorous notion at the crossroads between probability, information theory, dynamical systems and statistical physics
From MaRDI portal
Publication: 5740369
DOI: 10.1017/S0960129512000783
zbMath: 1342.94059
OpenAlex: W2134287462
Wikidata: Q63980027 (Scholia: Q63980027)
MaRDI QID: Q5740369
Publication date: 26 July 2016
Published in: Mathematical Structures in Computer Science
Full work available at URL: https://doi.org/10.1017/s0960129512000783
Related Items (20)
- Entropies from coarse-graining: convex polytopes vs. ellipsoids
- Reconstruction methods for networks: the case of economic and financial systems
- New bounds for Shannon, relative and Mandelbrot entropies via Hermite interpolating polynomial
- Estimation of divergences on time scales via the Green function and Fink's identity
- Combinatorial micro-macro dynamical systems
- Link prediction based on the mutual information with high-order clustering structure of nodes in complex networks
- Generalizations of cyclic refinements of Jensen's inequality by Lidstone's polynomial with applications in information theory
- Shannon information entropy, soliton clusters and Bose-Einstein condensation in log gravity
- On the correct implementation of the Hanurav-Vijayan selection procedure for unequal probability sampling without replacement
- Phase correlations in chaotic dynamics: a Shannon entropy measure
- Correlations in area preserving maps: a Shannon entropy approach
- The Shannon entropy as a measure of diffusion in multidimensional dynamical systems
- Shannon entropy diffusion estimates: sensitivity on the parameters of the method
- New bounds for Shannon, relative and Mandelbrot entropies via Abel-Gontscharoff interpolating polynomial
- Information entropy of activation process: application for low-temperature fluctuations of a myoglobin molecule
- Several new cyclic Jensen type inequalities and their applications
- Global dynamics and diffusion in the rational standard map
- Pull out all the stops: textual analysis via punctuation sequences
- Generalized cyclic Jensen and information inequalities
- The Shannon entropy: an efficient indicator of dynamical stability
Cites Work
- A Mathematical Theory of Communication
- Information in statistical physics
- Time-reversed dynamical entropy and irreversibility in Markovian random processes
- On the second law of thermodynamics and the piston problem
- Entropy balance in distributed reversible Gray-Scott model
- Toward a quantitative theory of self-generated complexity
- A sandwich proof of the Shannon-McMillan-Breiman theorem
- Exponential convergence to equilibrium for a class of random-walk models
- I-divergence geometry of probability distributions and minimization problems
- Toward a probabilistic approach to complex systems
- Note on two theorems in nonequilibrium statistical mechanics
- Measures of statistical complexity: why?
- Kolmogorov's contributions to the foundations of probability
- The entropy of a binary hidden Markov process
- The Individual Ergodic Theorem of Information Theory
- Information Theory and Statistical Mechanics
- The discrete versus continuous controversy in physics
- Some asymptotic properties of the entropy of a stationary ergodic data source with applications to data compression
- An entropy concentration theorem: applications in artificial intelligence and descriptive statistics
- Chaos and Coarse Graining in Statistical Mechanics
- Entropy, thermostats, and chaotic hypothesis
- Irreversibility and Heat Generation in the Computing Process
- Maximum entropy and conditional probability
- On the Complexity of Finite Sequences
- A universal algorithm for sequential data compression
- Complexity-based induction systems: Comparisons and convergence theorems
- Compression of individual sequences via variable-rate coding
- Complexity
- Chaotic dynamics, fluctuations, nonequilibrium ensembles
- Renyi's divergence and entropy rates for finite alphabet Markov sources
- The method of types [information theory]
- Extending the definition of entropy to nonequilibrium steady states
- Elements of Information Theory
- On the Length of Programs for Computing Finite Binary Sequences
- The definition of random sequences
- On Information and Sufficiency
- Maxwell's Demon Cannot Operate: Information and Entropy. I
- Physical Entropy and Information. II
- The Basic Theorems of Information Theory
- The Negentropy Principle of Information
- Probability, Frequency and Reasonable Expectation
- A generative theory of shape