ON SOME INEQUALITIES AND GENERALIZED ENTROPIES: A UNIFIED APPROACH
Publication:3728800
DOI: 10.1080/01969728508927780 · zbMath: 0596.94005 · OpenAlex: W1985273796 · MaRDI QID: Q3728800
Inder Jeet Taneja, Renato M. Capocelli
Publication date: 1985
Published in: Cybernetics and Systems
Full work available at URL: https://doi.org/10.1080/01969728508927780
Keywords: Shannon entropy ⋮ entropies of degree \((\alpha,\beta)\) ⋮ entropies of order \(\alpha\) and degree \(\beta\) ⋮ inequalities for useful measures of information
Measures of information, entropy (94A17) ⋮ Convexity of real functions of several variables, generalizations (26B25)
Related Items
Characterizations of sum form information measures on open domains ⋮ Some properties of generalized exponential entropies with applications to data compression ⋮ A generalized model for the analysis of association in ordinal contingency tables ⋮ A fuzzy probabilistic information system comparison criterion: Applications and properties
Cites Work
- A Mathematical Theory of Communication
- Entropy differential metric, distance and divergence measures in probability spaces: A unified approach
- Information in experiments and sufficiency
- Entropy of type \((\alpha,\beta)\) and other generalized measures in information theory
- On the inequality \(\sum p_if(p_i) \geq \sum p_if(q_i)\).
- On continuous solutions of a functional inequality
- New parametric measures of information
- Comments on “entropies of degree β and lower bounds for the average error rate”
- Conditions for Optimality of the Huffman Algorithm
- Almost total information convergence of order \(\alpha\) for a sequence of point processes (Corresp.)
- Tight lower bounds for optimum code length (Corresp.)
- On the convexity of some divergence measures based on entropy functions
- Universal codeword sets and representations of the integers
- Computation of random coding exponent functions
- Entropies of degree β and lower bounds for the average error rate
- Some equivalences between Shannon entropy and Kolmogorov complexity
- f-entropies, probability of error, and feature selection
- Renyi's entropy and the probability of error
- Variable-length source coding with a cost depending only on the code word length
- Variations on a theme by Huffman
- Nonadditive measures of average charge for heterogeneous questionnaires
- Combinatorial Merging and Huffman's Algorithm
- Fuzzy sets and decision theory
- On the characterization of Shannon's entropy by Shannon's inequality
- Order preserving measures of information
- A simple derivation of the coding theorem and some applications
- A coding theorem and Rényi's entropy
- On the Foundations of Information Theory
- Buffer overflow in variable length coding of fixed rate sources
- Definition of entropy by means of a coding problem
- Information radius
- Information-theoretical considerations on estimation problems
- An upper bound on the entropy series
- A definition of a nonprobabilistic entropy in the setting of fuzzy sets theory
- On variable-length-to-block coding
- On Information and Sufficiency
- Inequalities: theory of majorization and its applications