Source coding with escort distributions and Rényi entropy bounds
From MaRDI portal
Abstract: We discuss the interest of escort distributions and Rényi entropy in the context of source coding. We first recall a source coding theorem by Campbell relating a generalized measure of length to the Rényi-Tsallis entropy. We show that the associated optimal codes can be obtained using considerations on escort distributions. We propose a new family of length measures involving escort distributions and show that these generalized lengths are also bounded below by the Rényi entropy. Furthermore, we find that the standard Shannon code lengths are optimal for the new generalized length measures, whatever the entropic index. Finally, we show that in this setting there exists an interplay between standard and escort distributions.
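Campbell's bound recalled in the abstract can be checked numerically. The sketch below is illustrative only (the function names and example distribution are our own, not taken from the paper): it computes the escort distribution \(P_i = p_i^q / \sum_j p_j^q\), the Rényi entropy of order \(\alpha\), and Campbell's exponential average length, and shows that ideal (non-integer) code lengths built from the escort distribution with \(\alpha = 1/(1+t)\) attain the Rényi-entropy lower bound, while ordinary Shannon lengths satisfy it with slack.

```python
import math

def escort(p, q):
    """Escort distribution: P_i = p_i^q / sum_j p_j^q."""
    w = [pi ** q for pi in p]
    z = sum(w)
    return [wi / z for wi in w]

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (in bits), alpha != 1."""
    return math.log2(sum(pi ** alpha for pi in p)) / (1.0 - alpha)

def campbell_length(p, lengths, t):
    """Campbell's exponential average length: (1/t) * log2 sum_i p_i 2^(t*l_i)."""
    return math.log2(sum(pi * 2.0 ** (t * li) for pi, li in zip(p, lengths))) / t

# Example source distribution (illustrative values, not from the paper)
p = [0.5, 0.25, 0.125, 0.125]
t = 0.5                   # Campbell's length parameter
alpha = 1.0 / (1.0 + t)   # entropic index paired with t in Campbell's theorem

# Ideal code lengths derived from the escort distribution attain the bound...
P = escort(p, alpha)
l_escort = [-math.log2(Pi) for Pi in P]

# ...while ordinary Shannon lengths -log2(p_i) satisfy it with slack.
l_shannon = [-math.log2(pi) for pi in p]

print(campbell_length(p, l_escort, t))   # equals renyi_entropy(p, alpha)
print(campbell_length(p, l_shannon, t))  # >= renyi_entropy(p, alpha)
```

With non-integer lengths the equality case is exact; rounding up to integer Shannon lengths only adds slack, so the Rényi-entropy lower bound is preserved.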
Recommendations
- On a generalized entropy and a coding theorem
- A joint representation of Rényi's and Tsallis' entropy with application in coding theory
- Some coding theorem connected on generalized Rényi's entropy for incomplete power probability distribution \(p^\beta\)
- A coding theorem on generalized \(R\)-norm entropy
- Relative uniformity of sources and the comparison of optimal code costs
Cites work
- scientific article; zbMATH DE number 1912121
- A coding theorem and Rényi's entropy
- Distributions and channel capacities in generalized statistical mechanics
- Dual description of nonextensive ensembles
- Introduction to Nonextensive Statistical Mechanics
- Optimal Prefix Codes for Infinite Alphabets With Nonlinear Costs
- Source Coding for Quasiarithmetic Penalties
- The world according to Rényi: Thermodynamics of multifractal systems
- Tsallis distribution as a standard maximum entropy solution with `tail' constraint
Cited in (10)
- Optimal guessing under nonextensive framework and associated moment bounds
- Differential-escort transformations and the monotonicity of the LMC-Rényi complexity measure
- Optimal information, Jensen-RIG function and \(\alpha\)-Onicescu's correlation coefficient in terms of information generating functions
- Entropy and Source Coding for Integer-Dimensional Singular Random Variables
- Comparison of transfer entropy methods for financial time series
- Simple one-shot bounds for various source coding problems using smooth Rényi quantities
- A fixed-length source coding theorem on quasi-probability space
- Upper bounds on Shannon and Rényi entropies for central potentials
- Entropy approximation in lossy source coding problem
- Tsallis entropy measure of noise-aided information transmission in a binary channel