Variable-Length Source Dispersions Differ Under Maximum and Average Error Criteria


DOI: 10.1109/TIT.2020.3019062 · zbMATH Open: 1457.94074 · arXiv: 1910.05724 · OpenAlex: W3098890812 · MaRDI QID: Q5138917

Yuta Sakai, Vincent Y. F. Tan

Publication date: 4 December 2020

Published in: IEEE Transactions on Information Theory

Abstract: Variable-length compression without prefix-free constraints and with side-information available at both encoder and decoder is considered. Instead of requiring the code to be error-free, we allow for it to have a non-vanishing error probability. We derive one-shot bounds on the optimal average codeword length by proposing two new information quantities, namely the conditional and unconditional ε-cutoff entropies. Using these one-shot bounds, we obtain the second-order asymptotics of the problem under two different formalisms: the average and maximum probabilities of error over the realization of the side-information. While the first-order terms in the asymptotic expansions for both formalisms are identical, we find that the source dispersion under the average error formalism is, in most cases, strictly smaller than its maximum error counterpart. Applications to a certain class of guessing problems, previously studied by Kuzuoka [*IEEE Trans. Inf. Theory*, vol. 66, no. 3, pp. 1674–1690, 2020], are also discussed.
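To illustrate the setting, the following minimal sketch (not the authors' construction, and ignoring side-information) computes an average codeword length for variable-length compression without prefix-free constraints when an error probability of at most ε is tolerated: the least likely symbols, up to total mass ε, are allowed to be decoded incorrectly, and the i-th most probable surviving symbol receives a codeword of length ⌊log₂ i⌋ (the empty string has length 0). The function name and the greedy whole-symbol discarding rule are illustrative assumptions, not taken from the paper.

```python
import math

def avg_length_with_error(probs, eps):
    """Illustrative sketch: average codeword length for one-shot
    variable-length compression WITHOUT prefix-free constraints,
    tolerating decoding-error probability <= eps.

    Heuristic: greedily discard whole symbols from the least likely
    tail while their total mass stays <= eps (these may be decoded
    incorrectly), then give the i-th most probable kept symbol a
    codeword of length floor(log2(i)), i = 1, 2, ...
    """
    p = sorted(probs, reverse=True)
    discarded_mass = 0.0
    # Drop least-likely symbols as long as the error budget allows.
    while p and discarded_mass + p[-1] <= eps:
        discarded_mass += p.pop()
    # Optimal non-prefix-free lengths for the kept symbols:
    # ranks 1, 2, 3, ... get lengths 0, 1, 1, 2, 2, 2, 2, 3, ...
    return sum(pi * math.floor(math.log2(i))
               for i, pi in enumerate(p, start=1))

# Example: a near-geometric source distribution.
probs = [0.5, 0.25, 0.125, 0.0625, 0.03125, 0.03125]
print(avg_length_with_error(probs, 0.0))   # error-free baseline
print(avg_length_with_error(probs, 0.07))  # tolerate up to 7% error
```

Even a small error budget shortens the average length here, since the discarded tail symbols are exactly those that would have received the longest codewords; this is the intuition behind cutting off ε of probability mass in an ε-cutoff quantity.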


Full work available at URL: https://arxiv.org/abs/1910.05724






Cited In (3)






