An axiomatic derivation of the coding-theoretic possibilistic entropy (Q1826609)
From MaRDI portal
Latest revision as of 18:37, 6 June 2024
scientific article
Statements
6 August 2004
The study is concerned with defining a concept of entropy in the framework of possibility theory. In this setting a possibility distribution \(\Pi= [\pi_1,\pi_2,\dots, \pi_k]\) is defined over a finite alphabet \({\mathbf A}= \{a_1,a_2,\dots, a_k\}\) such that \(\pi_i= \Pi(a_i)\) and \(\max_i\pi_i= 1\) (the normalization condition). Furthermore, for any set \(A\subseteq{\mathbf A}\), \(\Pi(A)= \max_{a_i\in A}\pi_i\). Evidently \(\Pi(\emptyset)= 0\) and \(\Pi({\mathbf A})= 1\). Assuming the coding-theoretic standpoint and admitting a stationary and non-interactive possibilistic information source, the possibilistic entropy \(H_\varepsilon(\Pi)\) is defined in the form \(H_\varepsilon(\Pi)= \log|\{a_i: \pi_i> \varepsilon\}|\), \(\varepsilon\in [0,1)\), with \(|\cdot|\) denoting the cardinality of the corresponding set. A so-called strong-constraint possibilistic entropy is introduced as well, in which the strict inequality used in the previous construct is replaced by its weak counterpart \(\pi_i\geq \varepsilon\). A number of detailed properties of possibilistic entropies are provided, and uniqueness conditions are also presented. The two ``standard'' measures of nonspecificity, namely the Hartley measure (arising in set theory) and the U-uncertainty (encountered in possibility theory), are used to gauge the properties of the proposed concept.
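The definition above can be illustrated with a minimal sketch. The helper below is hypothetical (not from the paper), and the logarithm base 2 is an assumption in line with coding-theoretic convention; the paper writes only \(\log\).

```python
import math

def possibilistic_entropy(pi, eps, strong=False):
    """Coding-theoretic possibilistic entropy H_eps(Pi) (hypothetical helper).

    pi     : possibility distribution over a finite alphabet (max pi_i must be 1)
    eps    : threshold in [0, 1)
    strong : if True, use the strong-constraint variant (pi_i >= eps)
    """
    # Normalization condition of possibility theory: max_i pi_i = 1.
    assert max(pi) == 1.0, "possibility distributions satisfy max pi_i = 1"
    assert 0 <= eps < 1
    if strong:
        count = sum(1 for p in pi if p >= eps)  # weak inequality variant
    else:
        count = sum(1 for p in pi if p > eps)   # strict inequality, as defined
    # log of the cardinality of the epsilon-cut of the alphabet
    return math.log2(count)

# Example: Pi = [1.0, 0.7, 0.4, 0.1] over a four-letter alphabet
pi = [1.0, 0.7, 0.4, 0.1]
print(possibilistic_entropy(pi, 0.5))  # {1.0, 0.7} survive -> log2(2) = 1.0
print(possibilistic_entropy(pi, 0.0))  # all pi_i > 0 survive -> log2(4) = 2.0
```

Note that for \(\varepsilon=0\) the entropy reduces to the Hartley measure of the support of \(\Pi\), which is the sense in which the Hartley measure gauges the new concept.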
measures of information
possibility theory
possibilistic entropy
non-specificity
Shannon entropy
Hartley measure
U-uncertainty
fuzzy sets