An introduction to logical entropy and its relation to Shannon entropy
From MaRDI portal
Publication:5495359
DOI: 10.1142/S1793351X13400059
zbMATH Open: 1293.94031
arXiv: 2112.01966
OpenAlex: W3125798703
MaRDI QID: Q5495359
FDO: Q5495359
Authors: David P. Ellerman
Publication date: 4 August 2014
Published in: International Journal of Semantic Computing
Abstract: We live in the information age. Claude Shannon, as the father of the information age, gave us a theory of communications that quantified an "amount of information," but, as he pointed out, "no concept of information itself was defined." Logical entropy provides that definition. Logical entropy is the natural measure of the notion of information based on distinctions, differences, distinguishability, and diversity. It is the (normalized) quantitative measure of the distinctions of a partition on a set--just as the Boole-Laplace logical probability is the normalized quantitative measure of the elements of a subset of a set. And partitions and subsets are mathematically dual concepts--so the logic of partitions is dual in that sense to the usual Boolean logic of subsets, and hence the name "logical entropy." The logical entropy of a partition has a simple interpretation as the probability that a distinction or dit (elements in different blocks) is obtained in two independent draws from the underlying set. The Shannon entropy is shown to also be based on this notion of information-as-distinctions; it is the average minimum number of binary partitions (bits) that need to be joined to make all the same distinctions of the given partition. Hence all the concepts of simple, joint, conditional, and mutual logical entropy can be transformed into the corresponding concepts of Shannon entropy by a uniform non-linear dit-bit transform. And finally logical entropy linearizes naturally to the corresponding quantum concept. The quantum logical entropy of an observable applied to a state is the probability that two different eigenvalues are obtained in two independent projective measurements of that observable on that state.
Keywords: logical entropy, Shannon entropy, partitions, MaxEntropy, quantum logical entropy, von Neumann entropy
Full work available at URL: https://arxiv.org/abs/2112.01966
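The abstract's two-draws interpretation can be made concrete with a short sketch. For a partition whose blocks have probabilities p_1, ..., p_n, the logical entropy is the probability that two independent draws land in different blocks, h(p) = 1 - sum(p_i^2), while the Shannon entropy H(p) = sum(p_i * log2(1/p_i)) arises from the same distinction-counting via the dit-bit transform (replace each 1 - p_i with log2(1/p_i)). A minimal illustration, with an example distribution chosen here for demonstration:

```python
import math

def logical_entropy(probs):
    """h(p) = sum p_i (1 - p_i) = 1 - sum p_i^2:
    the probability that two independent draws fall in different blocks."""
    return 1.0 - sum(p * p for p in probs)

def shannon_entropy(probs):
    """H(p) = sum p_i log2(1/p_i): the dit-bit transform of h(p),
    replacing each factor (1 - p_i) with log2(1/p_i)."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Example block probabilities (illustrative, not from the paper)
probs = [0.5, 0.25, 0.25]
print(logical_entropy(probs))  # 1 - (0.25 + 0.0625 + 0.0625) = 0.625
print(shannon_entropy(probs))  # 0.5*1 + 0.25*2 + 0.25*2 = 1.5
```

Note that a partition with a single block (no distinctions) gives logical entropy 0, matching the interpretation: two draws can never be distinguished.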
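The quantum case described in the abstract reduces to the same two-draws formula applied to measurement outcomes: for a projective measurement with eigenprojectors P_i on a state rho, the outcome probabilities are p_i = tr(P_i rho), and the probability that two independent measurements yield different eigenvalues is 1 - sum(p_i^2). A sketch with a hypothetical qubit state and the Pauli-Z eigenprojectors (all numbers below are illustrative assumptions, not taken from the paper):

```python
import numpy as np

# Hypothetical qubit density matrix (Hermitian, trace 1, positive semidefinite)
rho = np.array([[0.7, 0.2],
                [0.2, 0.3]])

# Eigenprojectors of the Pauli-Z observable
P0 = np.array([[1, 0], [0, 0]])  # projector onto eigenvalue +1
P1 = np.array([[0, 0], [0, 1]])  # projector onto eigenvalue -1

# Outcome probabilities p_i = tr(P_i rho)
probs = [np.trace(P @ rho).real for P in (P0, P1)]

# Probability that two independent projective measurements
# of Z on rho yield different eigenvalues
quantum_logical_entropy = 1.0 - sum(p * p for p in probs)
print(quantum_logical_entropy)  # 1 - (0.7^2 + 0.3^2) = 0.42
```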
Cites Work
- Measurement of Diversity
- Diversity as a Concept and its Measurement
- Diversity and dissimilarity coefficients: A unified approach
- Towards a unifying approach to diversity measures: bridging the gap between the Shannon entropy and Rao's quadratic index
- The logic of partitions: introduction to the dual of the logic of subsets
- Counting distinctions: on the conceptual foundations of Shannon's information theory
Cited In (18)
- Nomclust 2.0: an R package for hierarchical clustering of objects characterized by nominal variables
- Kullback-Leibler divergence and mutual information of experiments in the fuzzy case
- Tsallis entropy of partitions in quantum logics
- Relative model of the logical entropy of sub-\(\sigma_\Theta\)-algebras
- Entropic geometry from logic
- A note on entropy of logic
- Kolmogorov-Sinai type logical entropy for generalized simultaneous measurements
- On local Tsallis entropy of relative dynamical systems
- Entropies and dynamical systems in Riesz MV-algebras
- Logical entropy of dynamical systems
- On entropy of a logical system
- Logical entropy and aggregation of fuzzy orthopartitions
- Tsallis entropy of fuzzy dynamical systems
- Logical entropy of dynamical systems -- a general model
- An introduction of logical entropy on sequential effect algebra
- An application of Ky Fan inequality: on Kullback-Leibler divergence between a probability distribution and its negation
- The logic of partitions: introduction to the dual of the logic of subsets
- Orthopartitions in knowledge representation and machine learning