Some properties of generalized exponential entropies with applications to data compression
From MaRDI portal
Publication: 1187212
DOI: 10.1016/0020-0255(92)90027-6 · zbMath: 0746.94006 · OpenAlex: W2055027776 · MaRDI QID: Q1187212
Publication date: 28 June 1992
Published in: Information Sciences
Full work available at URL: https://doi.org/10.1016/0020-0255(92)90027-6
Keywords: data compression · probability distribution · differential entropies · exponential entropies · exponential families of distributions
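The cited work "Exponential entropy as a measure of extent of a distribution" (Campbell) defines the exponential entropy of a discrete distribution as \(e^{H(p)}\), where \(H\) is the Shannon entropy in nats. A minimal sketch of that quantity, assuming natural logarithms and illustrative function names not taken from the paper:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i in nats; zero-probability
    terms are skipped, following the convention 0 log 0 = 0."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def exponential_entropy(p):
    """Campbell's exponential entropy exp(H(p)), interpretable as the
    'effective number of outcomes' (extent) of the distribution p."""
    return math.exp(shannon_entropy(p))

# The uniform distribution on n points has extent n; a point mass has extent 1.
print(exponential_entropy([0.25, 0.25, 0.25, 0.25]))  # ≈ 4
print(exponential_entropy([1.0]))                     # ≈ 1
```

This illustrates why exponential entropy behaves as a measure of spread: unlike \(H\) itself, it scales multiplicatively with the number of equally likely outcomes.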
Related Items (7)
- Minimum entropy of error principle in estimation
- On generalized Gini means and scales of means
- Minimum Rényi entropy portfolios
- Some source coding theorems and 1:1 coding based on generalized inaccuracy measure of order \(\alpha\) and type \(\beta\)
- Some new fuzzy entropy formulas
- Development of two new mean codeword lengths
- Some Results on Dynamic Generalized Survival Entropy
Cites Work
- The relative information generating function
- On Jessen's inequality for convex functions. II
- Large deviations of the sample mean in general vector spaces
- Gaussian measures in Banach spaces
- Continuity and characterization of Shannon-Wiener information measure for continuous probability distributions
- A Note on Entropy
- A note on the entropy of a continuous distribution
- Two Remarks on the Basic Theorems of Information Theory.
- Some remarks on the dimension and entropy of random variables
- On some inequalities and generalized entropies: a unified approach
- Bounds on the entropy series
- New results on the entropy of deterministic maps: applications to Liapunov exponent, divergence, compositional inference, and pattern encoding
- The Detection of Signals in Impulsive Noise Modeled as a Mixture Process
- Exponential entropy as a measure of extent of a distribution
- Epsilon Entropy of Stochastic Processes
- Detectors for discrete-time signals in non-Gaussian noise
- Epsilon Entropy and Data Compression
- An upper bound on the entropy series
- An Inversion Theorem and Generalized Entropies for Continuous Distributions
- Measuring information beyond communication theory. Some probably useful and some almost certainly useless generalizations