Encoding Tasks and Rényi Entropy
DOI: 10.1109/TIT.2014.2329490
zbMATH Open: 1360.94145
arXiv: 1401.6338
OpenAlex: W3103345930
MaRDI QID: Q2986130
FDO: Q2986130
Authors: Christoph Bunte, Amos Lapidoth
Publication date: 16 May 2017
Published in: IEEE Transactions on Information Theory
Abstract: A task is randomly drawn from a finite set of tasks and is described using a fixed number of bits. All the tasks that share its description must be performed. Upper and lower bounds on the minimum ρ-th moment of the number of performed tasks are derived. The case where a sequence of tasks is produced by a source and n tasks are jointly described using nR bits is considered. If R is larger than the Rényi entropy rate of the source of order 1/(1+ρ) (provided it exists), then the ρ-th moment of the ratio of performed tasks to n can be driven to one as n tends to infinity. If R is smaller than the Rényi entropy rate, this moment tends to infinity. The results are generalized to account for the presence of side-information. In this more general setting, the key quantity is a conditional version of Rényi entropy that was introduced by Arimoto. For IID sources two additional extensions are solved, one of a rate-distortion flavor and the other where different tasks may have different nonnegative costs. Finally, a divergence that was identified by Sundaresan as a mismatch penalty in the Massey-Arikan guessing problem is shown to play a similar role here.
Full work available at URL: https://arxiv.org/abs/1401.6338
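The threshold rate in the abstract is the Rényi entropy of order α = 1/(1+ρ), defined by H_α(P) = (1/(1−α)) log₂ Σₓ P(x)^α. The following Python sketch (not taken from the paper; the function names and the example distribution are illustrative assumptions) computes this critical rate for a given task distribution:

```python
# Minimal sketch of the key quantity from the abstract: the Rényi entropy
# of order alpha = 1/(1+rho), which plays the role of the critical
# description rate R. Example distribution below is an assumption.
import math

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (in bits) of a probability vector p."""
    if alpha == 1.0:  # the limit alpha -> 1 recovers Shannon entropy
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)
    return math.log2(sum(pi ** alpha for pi in p if pi > 0)) / (1.0 - alpha)

def critical_rate(p, rho):
    """Threshold rate from the abstract: Rényi entropy of order 1/(1+rho)."""
    return renyi_entropy(p, 1.0 / (1.0 + rho))

if __name__ == "__main__":
    p = [0.5, 0.25, 0.125, 0.125]  # illustrative task distribution
    for rho in (0.5, 1.0, 2.0):
        print(f"rho={rho}: critical rate = {critical_rate(p, rho):.4f} bits/task")
```

Since 1/(1+ρ) < 1 for ρ > 0, this critical rate always lies at or above the Shannon entropy, reflecting that controlling higher moments of the number of performed tasks demands a higher description rate.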
Recommendations
- The complexity of estimating Rényi entropy
- Generalized entropies in coding theory
- A joint representation of Rényi's and Tsallis' entropy with application in coding theory
- Entropy, universal coding, approximation, and bases properties
- Rényi entropy estimation revisited
- Entropy lower bounds related to a problem of universal coding and prediction
- On the Conditional Smooth Rényi Entropy and its Applications in Guessing and Source Coding
Cited In (6)
- Cramér-Rao lower bounds arising from generalized Csiszár divergences
- Minimization with respect to divergences and applications
- Entropy preservation under Markov coding
- Entropy encoding, Hilbert space, and Karhunen-Loève transforms
- Projection theorems and estimating equations for power-law models
- A unified approach to the Pythagorean identity and projection theorem for a class of divergences based on M-estimations