Maximal Repetition and Zero Entropy Rate
Publication: Q4569175
DOI: 10.1109/TIT.2017.2733535
zbMATH Open: 1390.94620
arXiv: 1609.04683
OpenAlex: W3101288807
MaRDI QID: Q4569175
FDO: Q4569175
Author: Łukasz Dębowski
Publication date: 27 June 2018
Published in: IEEE Transactions on Information Theory
Abstract: Maximal repetition of a string is the maximal length of a repeated substring. This paper investigates maximal repetition of strings drawn from stochastic processes. Strengthening previous results, two new bounds for the almost sure growth rate of maximal repetition are identified: an upper bound in terms of conditional Rényi entropy of order γ > 1 given a sufficiently long past and a lower bound in terms of unconditional Shannon entropy (γ = 1). Both the upper and the lower bound can be proved using an inequality for the distribution of recurrence time. We also supply an alternative proof of the lower bound which makes use of an inequality for the expectation of subword complexity. In particular, it is shown that a power-law logarithmic growth of maximal repetition with respect to the string length, recently observed for texts in natural language, may hold only if the conditional Rényi entropy rate given a sufficiently long past equals zero. According to this observation, natural language cannot be faithfully modeled by a typical hidden Markov process, which is a class of basic language models used in computational linguistics.
Full work available at URL: https://arxiv.org/abs/1609.04683
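The abstract's central quantity is straightforward to compute for a concrete string. Below is a minimal Python sketch (not taken from the paper; the function name maximal_repetition and the coin-flip demo are illustrative assumptions) that finds the maximal length of a repeated substring by binary search over candidate lengths.

```python
import random

def maximal_repetition(w: str) -> int:
    """Return L(w): the maximal length of a substring that occurs at least
    twice in w (occurrences may overlap); 0 if no repetition exists."""
    def has_repeat(length: int) -> bool:
        seen = set()
        for i in range(len(w) - length + 1):
            sub = w[i:i + length]
            if sub in seen:
                return True
            seen.add(sub)
        return False

    # If a repeated substring of length k exists, so does one of every shorter
    # length, so the property is monotone and binary search applies.
    lo, hi = 0, max(len(w) - 1, 0)
    while lo < hi:
        mid = (lo + hi + 1) // 2
        if has_repeat(mid):
            lo = mid
        else:
            hi = mid - 1
    return lo

if __name__ == "__main__":
    # Illustrative demo (assumption, not from the paper): for i.i.d. fair-coin
    # strings, maximal repetition grows only logarithmically with string length,
    # in line with the entropy-based bounds described in the abstract.
    random.seed(0)
    for n in (1_000, 10_000, 100_000):
        w = "".join(random.choice("01") for _ in range(n))
        print(n, maximal_repetition(w))
```

The set-based check costs roughly O(n · ℓ) per candidate length; a suffix-array or suffix-tree approach would be asymptotically faster, but the sketch above keeps the definition of maximal repetition explicit.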
Mathematics Subject Classification:
Statistical aspects of information-theoretic topics (62B10)
Measures of information, entropy (94A17)
Stationary stochastic processes (60G10)
Cited In (3)