Axiomatic characterizations of information measures

DOI: 10.3390/e10030261
zbMath: 1179.94043
OpenAlex: W2149570649
MaRDI QID: Q845363

Imre Csiszár

Publication date: 29 January 2010

Published in: Entropy

Full work available at URL: https://doi.org/10.3390/e10030261




Related Items (61)

Highly symmetric POVMs and their informational power
Multi-class and Cluster Evaluation Measures Based on Rényi and Tsallis Entropies and Mutual Information
Path-Based Divergence Rates and Lagrangian Uncertainty in Stochastic Flows
Objective Bayesianism and the maximum entropy principle
On local Tsallis entropy of relative dynamical systems
Informations in models of evolutionary dynamics
On the connections of generalized entropies with Shannon and Kolmogorov-Sinai entropies
A foundational approach to generalising the maximum entropy inference process to the multi-agent context
Updating a progic
A triple uniqueness of the maximum entropy approach
Unidirectional random growth with resetting
On the weighted Gini-Simpson index: estimating feasible weights using the optimal point and discussing a link with possibility theory
Dependence assessment based on generalized relative complexity: application to sampling network design
A Medida de Informação de Shannon: Entropia [Shannon's information measure: entropy]
Prediction in Riemannian metrics derived from divergence functions
Bounds of the Pinsker and Fannes types on the Tsallis relative entropy
Processing distortion models: a comparative study
Logical perspectives on the foundations of probability
Rules of proof for maximal entropy inference
Measures of conflict, basic axioms and their application to the clusterization of a body of evidence
An axiomatization of information flow measures
A variational framework for exemplar-based image inpainting
Probabilism, entropies and strictly proper scoring rules
Rényi entropy of the totally asymmetric exclusion process
Continuity and additivity properties of information decompositions
Categorical magnitude and entropy
Some general properties of unified entropies
Additive Scoring Rules for Discrete Sample Spaces
Determining maximal entropy functions for objective Bayesian inductive logic
Equitability, interval estimation, and statistical power
Adaptive decision making via entropy minimization
A short characterization of relative entropy
Quantum conditional relative entropy and quasi-factorization of the relative entropy
Measuring diversity in heterogeneous information networks
New perspectives on multilocus ancestry informativeness
Copula theory and probabilistic sensitivity analysis: is there a connection?
Tsallis entropy of partitions in quantum logics
Lagrangian Uncertainty Quantification and Information Inequalities for Stochastic Flows
Probability as a measure of information added
The value of information for populations in varying environments
Generalized Conditional Entropy — Determinicity of a Process and Rokhlin's Formula
On quantum conditional entropies defined in terms of the F-divergences
Tsallis entropy and generalized Shannon additivity
Generalized Shannon-Khinchin axioms and uniqueness theorem for pseudo-additive entropies
Law invariant risk measures and information divergences
Asymptotic behavior of the maximum entropy routing in computer networks
An entropy-based weighted concept lattice for merging multi-source geo-ontologies
Aleatoric and epistemic uncertainty in machine learning: an introduction to concepts and methods
Lectures on Entropy. I: Information-Theoretic Notions
Aggregating incoherent agents who disagree
The General Form of γ-Family of Quantum Relative Entropies
Unnamed Item
Inflation as an information bottleneck: a strategy for identifying universality classes and making robust predictions
Inequalities for Tsallis relative entropy and generalized skew information
Conjugate predictive distributions and generalized entropies
Further results on generalized conditional entropies
Representing preorders with injective monotones
Coherence quantifiers from the viewpoint of their decreases in the measurement process
Variety of evidence and the elimination of hypotheses
An axiomatic characterization of a two-parameter extended relative entropy
Moderating probability distributions for unrepresented uncertainty: Application to sentiment analysis via deep learning

