Pages that link to "Item:Q4404131"
The following pages link to Why the Shannon and Hartley entropies are ‘natural’ (Q4404131):
- Equivalence of partition functions leads to classification of entropies and means (Q406120)
- Characterizing entropy in statistical physics and in quantum information theory (Q474897)
- Tribute to a distinguished Professor János Aczél at 85 (Q623421)
- Measuring information beyond communication theory - why some generalized information measures may be useful, others not (Q799647)
- Measures of uncertainty for imprecise probabilities: an axiomatic approach (Q985132)
- Revisiting prior distributions. II: Implications of the physical prior in maximum entropy analysis (Q1001518)
- Where do we stand on measures of uncertainty, ambiguity, fuzziness, and the like? (Q1096591)
- Uniqueness of information measure in the theory of evidence (Q1100181)
- The fundamental equation of information and its generalizations (Q1112004)
- A characterization of the Segal entropy (Q1135503)
- On some generalized functional equations in information theory (Q1142374)
- The many facets of entropy (Q1198547)
- A unique characterization of the generalized Boltzmann-Gibbs-Shannon entropy (Q1240712)
- Characterizations of a discrete normal distribution (Q1372406)
- Representing preorders with injective monotones (Q2084936)
- Entropic mobility index as a measure of (in)equality of opportunity (Q2208854)
- A Comparative Assessment of Various Measures of Entropy (Q3039231)
- MEASURES OF UNCERTAINTY AND INFORMATION BASED ON POSSIBILITY DISTRIBUTIONS (Q3962915)
- (Q4160821)
- Lectures on Entropy. I: Information-Theoretic Notions (Q4969610)
- The Shannon Measure of Information: Entropy [A Medida de Informação de Shannon: Entropia] (Q5074105)
- Informational separability and entropy (Q6203352)