Markov Categories and Entropy
Publication:6507387
DOI: 10.1109/TIT.2023.3328825 · arXiv: 2212.11719 · MaRDI QID: Q6507387 · FDO: Q6507387
Author: Paolo Perrone
Abstract: Markov categories are a novel framework to describe and treat problems in probability and information theory. In this work we combine the categorical formalism with the traditional quantitative notions of entropy, mutual information, and data processing inequalities. We show that several quantitative aspects of information theory can be captured by an enriched version of Markov categories, where the spaces of morphisms are equipped with a divergence or even a metric. As it is customary in information theory, mutual information can be defined as a measure of how far a joint source is from displaying independence of its components. More strikingly, Markov categories give a notion of determinism for sources and channels, and we can define entropy exactly by measuring how far a source or channel is from being deterministic. This recovers Shannon and Rényi entropies, as well as the Gini-Simpson index used in ecology to quantify diversity, and it can be used to give a conceptual definition of generalized entropy.
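The abstract's two central ideas have concrete classical counterparts: mutual information is the KL divergence of a joint distribution from the product of its marginals, and the entropy of a source p equals the divergence of the perfectly correlated joint (x, x) ~ p (the image of p under the deterministic copy map) from the independent product p ⊗ p. A minimal numerical sketch of these identities, with the Gini-Simpson index alongside (function names are ours, not from the paper):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits, finite alphabet."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def shannon_entropy(p):
    """Shannon entropy in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def entropy_as_distance_from_determinism(p):
    """Entropy of p computed as D(copy(p) || p ⊗ p): how far the
    perfectly correlated joint is from the independent product."""
    n = len(p)
    # Joint of (x, x) with x ~ p: mass p[i] on the diagonal, 0 elsewhere.
    joint_copy = [p[i] if i == j else 0.0 for i in range(n) for j in range(n)]
    # Independent product p ⊗ p.
    product = [p[i] * p[j] for i in range(n) for j in range(n)]
    return kl_divergence(joint_copy, product)

def gini_simpson(p):
    """Gini-Simpson index: probability that two independent draws differ."""
    return 1.0 - sum(pi * pi for pi in p)

p = [0.5, 0.25, 0.25]
print(shannon_entropy(p))                       # 1.5 bits
print(entropy_as_distance_from_determinism(p))  # also 1.5 bits
print(gini_simpson(p))                          # 0.625
```

On the diagonal, each term contributes p_i · log(p_i / p_i²) = p_i · log(1/p_i), so the divergence from independence collapses to the usual Shannon formula; this is the classical shadow of the paper's categorical definition of entropy via determinism.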
MSC classification: Graphical methods in statistics (62A09); Measures of information, entropy (94A17); Axioms; other general questions in probability (60A05); Semantics in the theory of computing (68Q55); Enriched categories (over closed or monoidal categories) (18D20); Categories of networks and processes, compositionality (18M35)
This page was built for publication: Markov Categories and Entropy