Encoding Tasks and Rényi Entropy
From MaRDI portal
Publication:2986130
Abstract: A task is randomly drawn from a finite set of tasks and is described using a fixed number of bits. All the tasks that share its description must be performed. Upper and lower bounds on the minimum ρ-th moment of the number of performed tasks are derived. The case where a sequence of tasks is produced by a source and n tasks are jointly described using nR bits is considered. If R is larger than the Rényi entropy rate of the source of order 1/(1+ρ) (provided it exists), then the ρ-th moment of the ratio of performed tasks to n can be driven to one as n tends to infinity. If R is smaller than the Rényi entropy rate, this moment tends to infinity. The results are generalized to account for the presence of side-information. In this more general setting, the key quantity is a conditional version of Rényi entropy that was introduced by Arimoto. For IID sources two additional extensions are solved, one of a rate-distortion flavor and the other where different tasks may have different nonnegative costs. Finally, a divergence that was identified by Sundaresan as a mismatch penalty in the Massey-Arikan guessing problem is shown to play a similar role here.
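The threshold quantity in the abstract is the Rényi entropy of order 1/(1+ρ). As an illustration (not part of the MaRDI record), a minimal sketch of this quantity, with an arbitrary example distribution `p` and moment parameter `rho`:

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (alpha > 0, alpha != 1), in bits:
    H_alpha(P) = log2(sum_i p_i**alpha) / (1 - alpha)."""
    assert alpha > 0 and alpha != 1
    return math.log2(sum(pi ** alpha for pi in p)) / (1 - alpha)

# Example distribution and moment parameter (illustrative choices only).
p = [0.5, 0.25, 0.125, 0.125]
rho = 1.0

# The critical description rate for the rho-th moment of the number of
# performed tasks is the Rényi entropy of order 1/(1+rho).
alpha = 1 / (1 + rho)
print(renyi_entropy(p, alpha))
```

For ρ > 0 the order 1/(1+ρ) lies in (0, 1), so this threshold is at least the Shannon entropy, reflecting that controlling a higher moment requires a higher description rate.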
Recommendations
- The complexity of estimating Rényi entropy
- Generalized entropies in coding theory
- A joint representation of Rényi's and Tsalli's entropy with application in coding theory
- Entropy, universal coding, approximation, and bases properties
- Renyi entropy estimation revisited
- Entropy lower bounds related to a problem of universal coding and prediction
- On the Conditional Smooth Rényi Entropy and its Applications in Guessing and Source Coding
Cited in (7)
- A unified approach to the Pythagorean identity and projection theorem for a class of divergences based on M-estimations
- scientific article; zbMATH DE number 5208524
- Cramér-Rao lower bounds arising from generalized Csiszár divergences
- Minimization with respect to divergences and applications
- Projection theorems and estimating equations for power-law models
- Entropy preservation under Markov coding.
- Entropy encoding, Hilbert space, and Karhunen-Loève transforms