Fuzzy entropy and conditioning (Q1092002)

From MaRDI portal

Language: English
Label: Fuzzy entropy and conditioning
Description: scientific article

    Statements

    Fuzzy entropy and conditioning (English)
    Publication year: 1986
    A new nonprobabilistic entropy measure is introduced in the context of fuzzy sets or messages. Fuzzy units, or fits, replace bits in a new framework of fuzzy information theory. An appropriate measure of the entropy or fuzziness of a message is shown to be a simple ratio of distances: the distances between the fuzzy message and its nearest and farthest nonfuzzy neighbors. Fuzzy conditioning is examined as the degree of subsethood (submessagehood) of one fuzzy set or message in another. This quantity is shown to behave as a conditional probability in many contexts. It is also shown that the entropy of A is the degree to which \(A\cup A^c\) is a subset of \(A\cap A^c\), an intuitive relationship that cannot occur in probability theory. This theory of subsethood is then shown to solve one of the major problems with Bayes-theorem learning and its variants: the requirement that the space of alternatives be partitioned into disjoint exhaustive hypotheses. Any fuzzy subsets will do. However, a rough inverse relationship holds between the number and fuzziness of the partitions and the information gained from experience. All results reduce to fuzzy cardinality.
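
A minimal sketch, not part of the portal entry, of the two quantities described above: entropy as the ratio of distances to the nearest and farthest nonfuzzy neighbors, and subsethood as a ratio of fuzzy cardinalities (sigma-counts). It assumes finite fuzzy sets represented as lists of membership values in [0, 1], the standard min/max/1-x operators for intersection, union, and complement, and the l1 metric; the function names (sigma_count, subsethood, fuzzy_entropy) are chosen here for illustration.

# Sketch of the fuzzy entropy and subsethood measures on finite fuzzy sets.
# Assumptions (not fixed by the entry above): a fuzzy set is a list of
# membership values in [0, 1]; min/max model intersection/union and
# 1 - x models the complement; distances are taken in the l1 metric.

def sigma_count(a):
    """Fuzzy cardinality: the sum of the membership values."""
    return sum(a)

def subsethood(a, b):
    """Degree to which A is a subset of B: M(A intersect B) / M(A)."""
    intersection = [min(x, y) for x, y in zip(a, b)]
    return sigma_count(intersection) / sigma_count(a)

def fuzzy_entropy(a):
    """Ratio of l1 distances from A to its nearest and farthest
    nonfuzzy (0/1-valued) neighbors."""
    nearest = [1 if x > 0.5 else 0 for x in a]   # round each membership
    farthest = [1 - n for n in nearest]          # opposite vertex
    d_near = sum(abs(x - n) for x, n in zip(a, nearest))
    d_far = sum(abs(x - f) for x, f in zip(a, farthest))
    return d_near / d_far

a = [0.2, 0.5, 0.9, 0.1]
a_complement = [1 - x for x in a]
union = [max(x, y) for x, y in zip(a, a_complement)]
intersection = [min(x, y) for x, y in zip(a, a_complement)]

print(fuzzy_entropy(a))                  # 0.2903...
print(subsethood(union, intersection))   # 0.2903...

On this toy example both formulations return the same value, illustrating the identity stated in the review: the entropy of A equals the degree to which \(A\cup A^c\) is a subset of \(A\cap A^c\).
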
    fuzzy sets
    fuzzy information theory
    measure of entropy
    fuzziness of messages
    fuzzy conditioning