Axiomatic characterizations of information measures
Publication: 845363
DOI: 10.3390/e10030261
zbMath: 1179.94043
OpenAlex: W2149570649
MaRDI QID: Q845363
Publication date: 29 January 2010
Published in: Entropy
Full work available at URL: https://doi.org/10.3390/e10030261
Keywords: functional equation, maximum entropy, Shannon entropy, Bregman distance, proper score, \(f\)-divergence, \(f\)-entropy, Kullback I-divergence, Rényi information measures, transitive inference rule
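For orientation, three of the measures named in these keywords have standard textbook forms, recalled here in conventional notation (not necessarily the paper's own notation or axiom system): the Shannon entropy of a distribution \(P = (p_1, \dots, p_n)\),
\[ H(P) = -\sum_{i=1}^{n} p_i \log p_i, \]
the \(f\)-divergence of \(P\) from \(Q\), defined for convex \(f\) with \(f(1) = 0\), where the choice \(f(t) = t \log t\) recovers the Kullback I-divergence,
\[ D_f(P \,\|\, Q) = \sum_{i=1}^{n} q_i \, f\!\left(\frac{p_i}{q_i}\right), \]
and the Rényi entropy of order \(\alpha > 0\), \(\alpha \neq 1\), which tends to \(H(P)\) as \(\alpha \to 1\):
\[ H_\alpha(P) = \frac{1}{1-\alpha} \log \sum_{i=1}^{n} p_i^{\alpha}. \]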
Related Items (61)
Highly symmetric POVMs and their informational power
Multi-class and Cluster Evaluation Measures Based on Rényi and Tsallis Entropies and Mutual Information
Path-Based Divergence Rates and Lagrangian Uncertainty in Stochastic Flows
Objective Bayesianism and the maximum entropy principle
On local Tsallis entropy of relative dynamical systems
Informations in models of evolutionary dynamics
On the connections of generalized entropies with Shannon and Kolmogorov-Sinai entropies
A foundational approach to generalising the maximum entropy inference process to the multi-agent context
Updating a progic
A triple uniqueness of the maximum entropy approach
Unidirectional random growth with resetting
On the weighted Gini-Simpson index: estimating feasible weights using the optimal point and discussing a link with possibility theory
Dependence assessment based on generalized relative complexity: application to sampling network design
A Medida de Informação de Shannon: Entropia [Shannon's measure of information: entropy]
Prediction in Riemannian metrics derived from divergence functions
Bounds of the Pinsker and Fannes types on the Tsallis relative entropy
Processing distortion models: a comparative study
Logical perspectives on the foundations of probability
Rules of proof for maximal entropy inference
Measures of conflict, basic axioms and their application to the clusterization of a body of evidence
An axiomatization of information flow measures
A variational framework for exemplar-based image inpainting
Probabilism, entropies and strictly proper scoring rules
Rényi entropy of the totally asymmetric exclusion process
Continuity and additivity properties of information decompositions
Categorical magnitude and entropy
Some general properties of unified entropies
Additive Scoring Rules for Discrete Sample Spaces
Determining maximal entropy functions for objective Bayesian inductive logic
Equitability, interval estimation, and statistical power
Adaptive decision making via entropy minimization
A short characterization of relative entropy
Quantum conditional relative entropy and quasi-factorization of the relative entropy
Measuring diversity in heterogeneous information networks
New perspectives on multilocus ancestry informativeness
Copula theory and probabilistic sensitivity analysis: is there a connection?
Tsallis entropy of partitions in quantum logics
Lagrangian Uncertainty Quantification and Information Inequalities for Stochastic Flows
Probability as a measure of information added
The value of information for populations in varying environments
Generalized Conditional Entropy — Determinicity of a Process and Rokhlin's Formula
On quantum conditional entropies defined in terms of the \(F\)-divergences
Tsallis entropy and generalized Shannon additivity
Generalized Shannon-Khinchin axioms and uniqueness theorem for pseudo-additive entropies
Law invariant risk measures and information divergences
Asymptotic behavior of the maximum entropy routing in computer networks
An entropy-based weighted concept lattice for merging multi-source geo-ontologies
Aleatoric and epistemic uncertainty in machine learning: an introduction to concepts and methods
Lectures on Entropy. I: Information-Theoretic Notions
Aggregating incoherent agents who disagree
The General Form of γ-Family of Quantum Relative Entropies
Unnamed Item
Inflation as an information bottleneck: a strategy for identifying universality classes and making robust predictions
Inequalities for Tsallis relative entropy and generalized skew information
Conjugate predictive distributions and generalized entropies
Further results on generalized conditional entropies
Representing preorders with injective monotones
Coherence quantifiers from the viewpoint of their decreases in the measurement process
Variety of evidence and the elimination of hypotheses
An axiomatic characterization of a two-parameter extended relative entropy
Moderating probability distributions for unrepresented uncertainty: Application to sentiment analysis via deep learning
Uses Software