Pages that link to "Item:Q845363"
The following pages link to Axiomatic characterizations of information measures (Q845363):
- Highly symmetric POVMs and their informational power (Q265425)
- Objective Bayesianism and the maximum entropy principle (Q280547)
- Informations in models of evolutionary dynamics (Q290489)
- On the connections of generalized entropies with Shannon and Kolmogorov-Sinai entropies (Q296289)
- A foundational approach to generalising the maximum entropy inference process to the multi-agent context (Q296517)
- Dependence assessment based on generalized relative complexity: application to sampling network design (Q340137)
- Bounds of the Pinsker and Fannes types on the Tsallis relative entropy (Q372900)
- A variational framework for exemplar-based image inpainting (Q408937)
- The value of information for populations in varying environments (Q540576)
- Some general properties of unified entropies (Q635784)
- New perspectives on multilocus ancestry informativeness (Q669102)
- Asymptotic behavior of the maximum entropy routing in computer networks (Q742656)
- An entropy-based weighted concept lattice for merging multi-source geo-ontologies (Q742768)
- Probabilism, entropies and strictly proper scoring rules (Q899128)
- On local Tsallis entropy of relative dynamical systems (Q1626193)
- Adaptive decision making via entropy minimization (Q1726294)
- Copula theory and probabilistic sensitivity analysis: is there a connection? (Q1740560)
- Tsallis entropy of partitions in quantum logics (Q1741028)
- Generalized Shannon-Khinchin axioms and uniqueness theorem for pseudo-additive entropies (Q1783027)
- On quantum conditional entropies defined in terms of the \(F\)-divergences (Q2018741)
- Aleatoric and epistemic uncertainty in machine learning: an introduction to concepts and methods (Q2051254)
- Aggregating incoherent agents who disagree (Q2053354)
- Conjugate predictive distributions and generalized entropies (Q2080167)
- Representing preorders with injective monotones (Q2084936)
- Variety of evidence and the elimination of hypotheses (Q2096144)
- A triple uniqueness of the maximum entropy approach (Q2146064)
- Unidirectional random growth with resetting (Q2150365)
- On the weighted Gini-Simpson index: estimating feasible weights using the optimal point and discussing a link with possibility theory (Q2154317)
- Equitability, interval estimation, and statistical power (Q2218039)
- Measuring diversity in heterogeneous information networks (Q2227496)
- Probability as a measure of information added (Q2255192)
- Tsallis entropy and generalized Shannon additivity (Q2275124)
- Law invariant risk measures and information divergences (Q2283649)
- Inflation as an information bottleneck: a strategy for identifying universality classes and making robust predictions (Q2315702)
- An axiomatization of information flow measures (Q2422017)
- Updating a progic (Q2634492)
- Processing distortion models: a comparative study (Q2671749)
- Rules of proof for maximal entropy inference (Q2677855)
- Determining maximal entropy functions for objective Bayesian inductive logic (Q2698549)
- (Q2907892)
- The General Form of γ-Family of Quantum Relative Entropies (Q3087518)
- Inequalities for Tsallis relative entropy and generalized skew information (Q3104550)
- Coherence quantifiers from the viewpoint of their decreases in the measurement process (Q3120001)
- Rényi entropy of the totally asymmetric exclusion process (Q4599345)
- A short characterization of relative entropy (Q4628749)
- Quantum conditional relative entropy and quasi-factorization of the relative entropy (Q4629597)
- Lectures on Entropy. I: Information-Theoretic Notions (Q4969610)
- A Medida de Informação de Shannon: Entropia (Q5074105)
- Prediction in Riemannian metrics derived from divergence functions (Q5079256)
- Additive Scoring Rules for Discrete Sample Spaces (Q5120278)