Pages that link to "Item:Q1233423"
From MaRDI portal
The following pages link to "On measures of information and their characterizations" (Q1233423):
Displaying 50 items.
- Optimal quantization of the support of a continuous multivariate distribution based on mutual information (Q269123) (← links)
- Objective Bayesianism and the maximum entropy principle (Q280547) (← links)
- The generalized fundamental equation of information on symmetric cones (Q329384) (← links)
- Dependence assessment based on generalized relative complexity: application to sampling network design (Q340137) (← links)
- A characterization of entropy in terms of information loss (Q400959) (← links)
- Image comparison by compound disjoint information with applications to perceptual visual quality assessment, image registration and tracking (Q408798) (← links)
- Rank discrimination measures for enforcing monotonicity in decision tree induction (Q508805) (← links)
- Harnessing inequality (Q521777) (← links)
- Limited width parallel prefix circuits (Q547485) (← links)
- Optimal vector quantization in terms of Wasserstein distance (Q550175) (← links)
- On the stability of the modified entropy equation (Q606389) (← links)
- Can the maximum entropy principle be explained as a consistency requirement? (Q639753) (← links)
- A structure theorem for sum form functional equations (Q675215) (← links)
- Pseudo information entropy of a single trapped ion interacting with a laser field (Q707379) (← links)
- Geometry of distributions associated with Tsallis statistics and properties of relative entropy minimization (Q715962) (← links)
- The Shannon kernel of a non-negative information function (Q787953) (← links)
- Measuring information beyond communication theory - why some generalized information measures may be useful, others not (Q799647) (← links)
- Utility of gambling when events are valued: An application of inset entropy (Q836040) (← links)
- Minimum variance capacity identification (Q856242) (← links)
- Entropy of bi-capacities (Q857374) (← links)
- Towards a unifying approach to diversity measures: bridging the gap between the Shannon entropy and Rao's quadratic index (Q884282) (← links)
- Discrimination power of graph measures based on complex zeros of the partial Hosoya polynomial (Q902842) (← links)
- Branching inset entropies on open domains (Q923047) (← links)
- Purity, resistance, and innocence in utility theory (Q928766) (← links)
- Utility of gambling. II: Risk, paradoxes, and data (Q929347) (← links)
- The stability of the entropy of degree alpha (Q933462) (← links)
- Measures of uncertainty for imprecise probabilities: an axiomatic approach (Q985132) (← links)
- Bhattacharyya statistical divergence of quantum observables (Q1005513) (← links)
- On a functional equation in connection with information theory (Q1050552) (← links)
- Determination of all semisymmetric recursive information measures of multiplicative type on n positive discrete probability distributions (Q1053186) (← links)
- The role of functional equations in stochastic model building (Q1053378) (← links)
- A mixed theory of information. VIII: Inset measures depending upon several distributions (Q1055749) (← links)
- Complexity of compartmental models (Q1066826) (← links)
- A modification of the Whittaker-Kotelnikov-Shannon sampling series (Q1069074) (← links)
- On a coding theorem connected with entropy of order \(\alpha\) and type \(\beta\) (Q1069903) (← links)
- On regular solutions of functional equations (Q1074039) (← links)
- On symmetry and the directed divergence in information theory (Q1081584) (← links)
- The relation between information theory and the differential geometry approach to statistics (Q1086923) (← links)
- Quasicyclic symmetry and the directed divergence in information theory (Q1089308) (← links)
- A mixed theory of information. X: Information functions and information measures (Q1094387) (← links)
- On the uniqueness of possibilistic measure of uncertainty and information (Q1095876) (← links)
- A Minkowski theory of observation: Application to uncertainty and fuzziness (Q1096583) (← links)
- Where do we stand on measures of uncertainty, ambiguity, fuzziness, and the like? (Q1096591) (← links)
- On the entropy of \(\lambda\)-additive fuzzy measures (Q1105072) (← links)
- On measures of fuzziness (Q1112793) (← links)
- On a generalization of the Shannon functional inequality (Q1121850) (← links)
- A mixed theory of information. IV: Inset-inaccuracy and directed divergence (Q1139559) (← links)
- A mixed theory of information. V: How to keep the (inset) expert honest (Q1143989) (← links)
- On the entropy function of degree \(\beta\) (Q1144544) (← links)
- Nonprobabilistic entropies and indetermination measures in the setting of fuzzy sets theory (Q1150354) (← links)