On measures of information and their characterizations
Publication: 1233423
zbMath: 0345.94022
MaRDI QID: Q1233423
Publication date: 1975
Published in: Mathematics in Science and Engineering
Mathematics Subject Classification:
Goldbach-type theorems; other additive questions involving primes (11P32)
Research exposition (monographs, survey articles) pertaining to information and communication theory (94-02)
Information theory (general) (94A15)
General theory of functional equations and inequalities (39B05)
Functional equations and inequalities (39B99)
Related Items
Optimal quantization of the support of a continuous multivariate distribution based on mutual information ⋮ On the role of the \(\kappa\)-deformed Kaniadakis distribution in nonlinear plasma waves ⋮ Entropic measures of joint uncertainty: effects of lack of majorization ⋮ On symmetry and the directed divergence in information theory ⋮ Entropic properties of \(D\)-dimensional Rydberg systems ⋮ Inequality spectra ⋮ Resolving contradictions: A plausible semantics for inconsistent systems ⋮ Objective Bayesianism and the maximum entropy principle ⋮ Multiscale Rényi cumulative residual distribution entropy: reliability analysis of financial time series ⋮ The relation between information theory and the differential geometry approach to statistics ⋮ Diversity partitioning of Rao's quadratic entropy ⋮ Utility of gambling when events are valued: An application of inset entropy ⋮ Quasicyclic symmetry and the directed divergence in information theory ⋮ A mixed theory of information. X: Information functions and information measures ⋮ On the uniqueness of possibilistic measure of uncertainty and information ⋮ A Minkowski theory of observation: Application to uncertainty and fuzziness ⋮ Where do we stand on measures of uncertainty, ambiguity, fuzziness, and the like? ⋮ Investigating equality: the Rényi spectrum ⋮ Unidirectional random growth with resetting ⋮ The generalized fundamental equation of information on symmetric cones ⋮ Measures of entropy and fuzziness related to aggregation operators ⋮ A new weighted \((\alpha, \beta)\)-norm information measure with application in coding theory ⋮ On the entropy of \(\lambda\)-additive fuzzy measures ⋮ Minimum variance capacity identification ⋮ Entropy of bi-capacities ⋮ Dependence assessment based on generalized relative complexity: application to sampling network design ⋮ Disagreement degree of multi-person judgements in an additive structure ⋮ Fundamental equation of information revisited ⋮ On measures of fuzziness ⋮ Inequalities related to some types of entropies and divergences ⋮ Paul Erdős on functional equations: Contributions and impact ⋮ On a generalization of the Shannon functional inequality ⋮ On parametric evenness measures ⋮ On the stability of the modified entropy equation ⋮ Towards a unifying approach to diversity measures: bridging the gap between the Shannon entropy and Rao's quadratic index ⋮ An entropic form for NLFP with coulombic-like potential ⋮ A characterization of entropy in terms of information loss ⋮ A mixed theory of information. IV: Inset-inaccuracy and directed divergence ⋮ Image comparison by compound disjoint information with applications to perceptual visual quality assessment, image registration and tracking ⋮ A mixed theory of information. V: How to keep the (inset) expert honest ⋮ On the entropy function of degree \(\beta\) ⋮ On the concept of best achievable compression ratio for lossy image coding. ⋮ Discrimination power of graph measures based on complex zeros of the partial Hosoya polynomial ⋮ Nonprobabilistic entropies and indetermination measures in the setting of fuzzy sets theory ⋮ On the Shannon measure of entropy ⋮ Symmetry and the Shannon entropy ⋮ Monotonically equivalent entropies and solution of additivity equation ⋮ The general solution of a functional equation related to the mixed theory of information ⋮ A generalization of Bellman's functional equation ⋮ Can the maximum entropy principle be explained as a consistency requirement?
⋮ Cyclic symmetry and the Shannon entropy ⋮ An intuitionistic fuzzy \((\delta , \gamma )\)-norm entropy with its application in supplier selection problem ⋮ Branching inset entropies on open domains ⋮ A new information-theoretic approach to the entropy of non-random discrete maps relation to fractal dimension and temperature of curves ⋮ A generalization of the Havrda-Charvat and Tsallis entropy and its axiomatic characterization ⋮ Purity, resistance, and innocence in utility theory ⋮ Utility of gambling. II: Risk, paradoxes, and data ⋮ The stability of the entropy of degree alpha ⋮ A structure theorem for sum form functional equations ⋮ A tour of inequality ⋮ Rank discrimination measures for enforcing monotonicity in decision tree induction ⋮ On paraconcave entropy functions ⋮ An \((R',S')\)-norm fuzzy relative information measure and its applications in strategic decision-making ⋮ An (\(R\),\(S\))-norm fuzzy information measure with its applications in multiple-attribute decision-making ⋮ Harnessing inequality ⋮ Derivation of an amplitude of information in the setting of a new family of fractional entropies ⋮ Limited width parallel prefix circuits ⋮ Optimal vector quantization in terms of Wasserstein distance ⋮ Pseudo information entropy of a single trapped ion interacting with a laser field ⋮ Geometry of distributions associated with Tsallis statistics and properties of relative entropy minimization ⋮ Measures of uncertainty for imprecise probabilities: an axiomatic approach ⋮ A Fokker-Planck equation for a piecewise entropy functional defined in different space domains. an application to solute partitioning at the membrane-water interface ⋮ Generalized Shannon-Khinchin axioms and uniqueness theorem for pseudo-additive entropies ⋮ Of bits and wows: a Bayesian theory of surprise with applications to attention ⋮ Constructing blockmodels: how and why ⋮ Functional equations on restricted domains ⋮ Bhattacharyya statistical divergence of quantum observables ⋮ Continued fractions and the fundamental equation of information ⋮ Modeling protein cores with Markov random fields ⋮ On the characterization of quasiarithmetic means with weight function ⋮ On the dual of linear inverse problems ⋮ The characterization of a measure of information discrepancy ⋮ On nonlinear and nonextensive diffusion and the second law of thermodynamics ⋮ The Shannon kernel of a non-negative information function ⋮ On a functional equation in connection with information theory ⋮ Determination of all semisymmetric recursive information measures of multiplicative type on n positive discrete probability distributions ⋮ The role of functional equations in stochastic model building ⋮ The fuzzy hyperbolic inequality index associated with fuzzy random variables ⋮ Gaussian clustering method based on maximum-fuzzy-entropy interpretation ⋮ A mixed theory of information. 
VIII: Inset measures depending upon several distributions ⋮ Measuring information beyond communication theory - why some generalized information measures may be useful, others not ⋮ Entropy of discrete Choquet capacities ⋮ Families of OWA operators ⋮ Complexity of compartmental models ⋮ A modification of the Whittaker-Kotelnikov-Shannon sampling series ⋮ On a coding theorem connected with entropy of order \(\alpha\) and type \(\beta\) ⋮ Solution of (2,2)-type sum form functional equations with several unknown functions ⋮ Measurable solutions of a (2,2)-type sum form functional equation ⋮ On regular solutions of functional equations ⋮ Information cohomology of classical vector-valued observables ⋮ On determining a representation for measures of entropy possessing branching property without assuming symmetry ⋮ On some sum form functional equations related to information theory ⋮ Entropy conserving probability transforms and the entailment principle ⋮ On a problem of T. Szostok concerning the Hermite-Hadamard inequalities ⋮ Maximum L\(q\)-likelihood estimation ⋮ General representation theorems for efficient population behavior ⋮ Generalized information theory ⋮ Some information theoretic ideas useful in statistical inference ⋮ Branching and Generalized-Recursive Inset Entropies ⋮ A Medida de Informação de Shannon: Entropia ⋮ Functional equations with division and regular operations ⋮ A FEW REMARKS ON MEASURES OF UNCERTAINTY IN DEMPSTER-SHAFER THEORY1 ⋮ Unnamed Item ⋮ Unnamed Item ⋮ Unnamed Item ⋮ Unnamed Item ⋮ CONCEPTS OF FUZZY INFORMATION MEASURES ON CONTINUOUS DOMAINS ⋮ Unnamed Item ⋮ Unnamed Item ⋮ On entropy production of repeated quantum measurements. I. General theory ⋮ Unnamed Item ⋮ Maximum entropy derived and generalized under idempotent probability to address Bayes-frequentist uncertainty and model revision uncertainty: an information-theoretic semantics for possibility theory ⋮ Entropic uncertainty measures for large dimensional hydrogenic systems ⋮ On extreme values of the Rényi entropy under coupling of probability distributions ⋮ On the maximum values of \(f\)-divergence and Rényi divergence under a given variational distance ⋮ Refined estimations for some types of entropies and divergences ⋮ Lossy compression approach to subspace clustering ⋮ Unnamed Item ⋮ An axiomatic characterization of the Theil inequality ordering ⋮ Additive Scoring Rules for Discrete Sample Spaces ⋮ Entropy and monotonicity in artificial intelligence ⋮ Unnamed Item ⋮ A functional equation related to generalized entropies and the modular group ⋮ Entanglement rates for Rényi, Tsallis, and other entropies ⋮ EXTENDED ENTROPIES AND DISORDER ⋮ Measuring diversity in heterogeneous information networks ⋮ Rényi entropies for multidimensional hydrogenic systems in position and momentum spaces ⋮ Gelfand-Yaglom-Perez theorem for generalized relative entropy functionals ⋮ On the representation of measures over bounded lattices ⋮ MEASUREMENTS AND INFORMATION IN SPIN FOAM MODELS ⋮ An entropy group and its representation in thermodynamics of nonextensive systems ⋮ Regularity properties of functional equations ⋮ An extension theorem ⋮ Axioms for \((\alpha, \beta, \gamma)\)-entropy of a generalized probability scheme ⋮ Determination and interpretation of preferred orientation with texture goniometry: An application of indicators to maximum entropy pole- to orientation-density inversion ⋮ Measurement analogies: comparisons of behavioral and physical measures ⋮ Measurable Solutions of Functional Equations Related to 
Information Theory ⋮ On measurable solutions of a functional equation and their application to information theory ⋮ Marginal probability distribution determined by the maximum entropy method. ⋮ The determination of bounds of the \(\beta\)-entropic sum of two noncommuting observables ⋮ Information cost functions ⋮ Charakterisierung dynamischer leontief- systeme bei unbestimmten koeffizientenmatrizen ⋮ On the measurable solutions of some functional equations useful in information theory ⋮ Non-parametric estimation of Kullback-Leibler discrimination information based on censored data ⋮ Approach of complexity in nature: entropic nonuniqueness ⋮ Unnamed Item ⋮ A joint Shannon cipher and privacy amplification approach to attaining exponentially decaying information leakage ⋮ Unsupervized aggregation of commensurate correlated attributes by means of the choquet integral and entropy functionals ⋮ A \(q\)-parameter bound for particle spectra based on black hole thermodynamics with Rényi entropy ⋮ Axiomatics for the mean using Bemporad's condition ⋮ Lectures on Entropy. I: Information-Theoretic Notions ⋮ Fuzzy set methods for uncertainty management in intelligence analysis ⋮ An entropy group and its representation in thermodynamics of nonextensive systems ⋮ Unnamed Item ⋮ On some optimization problems for the Rényi divergence ⋮ First degree-based entropy of graphs ⋮ An intuitionistic fuzzy information measure of order-\((\alpha, \beta)\) with a new approach in supplier selection problems using an extended VIKOR method ⋮ Über eine Klasse von Informationsmaßen für die Bewertung stochastischer (partieller) Informationen ⋮ Unnamed Item ⋮ Use and Applications of Non-Additive Measures and Integrals ⋮ An axiomatic approach to the definition of the entropy of a discrete Choquet capacity ⋮ OWA operators with maximal Rényi entropy ⋮ Generalized degree-based graph entropies ⋮ Some Functional Equations Related to the Characterizations of Information Measures and Their Stability ⋮ Evolution of the entropy and Renyi difference information during self-organization of open additive systems ⋮ Some derivations of the Shannon entropy ⋮ Unnamed Item ⋮ An axiomatic characterization of a two-parameter extended relative entropy ⋮ Information theoretical properties of Tsallis entropies ⋮ Upper continuity bounds on the relative q-entropy for q > 1 ⋮ Characterizing ring derivations of all orders via functional equations: results and open problems ⋮ On the joint convexity of the Bregman divergence of matrices ⋮ EXTENSIONS OF THE SHANNON ENTROPY AND THE CHAOS GAME ALGORITHM TO HYPERBOLIC NUMBERS PLANE ⋮ Fuzzy Entropy Measure with an Applications in Decision Making Under Bipolar Fuzzy Environment based on TOPSIS Method ⋮ The fractional Kullback–Leibler divergence