Constant conditional entropy and related hypotheses


DOI: 10.1088/1742-5468/2013/07/L07001
zbMATH Open: 1456.91113
arXiv: 1304.7359
OpenAlex: W2030475414
MaRDI QID: Q3301633
FDO: Q3301633


Authors: Ramon Ferrer-i-Cancho, Łukasz Dębowski, Fermín Moscoso del Prado Martín


Publication date: 11 August 2020

Published in: Journal of Statistical Mechanics: Theory and Experiment

Abstract: Constant entropy rate (conditional entropies must remain constant as the sequence length increases) and uniform information density (conditional probabilities must remain constant as the sequence length increases) are two information theoretic principles that are argued to underlie a wide range of linguistic phenomena. Here we revise the predictions of these principles in the light of Hilberg's law on the scaling of conditional entropy in language and related laws. We show that constant entropy rate (CER) and two interpretations of uniform information density (UID), full UID and strong UID, are inconsistent with these laws. Strong UID implies CER but the reverse is not true. Full UID, a particular case of UID, leads to costly uncorrelated sequences that are totally unrealistic. We conclude that CER and its particular cases are incomplete hypotheses about the scaling of conditional entropies.
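
For orientation, the quantities the abstract contrasts can be written down compactly. The following is a minimal LaTeX sketch, not part of the MaRDI record: the symbol h_n is our shorthand for the conditional entropy of the n-th unit, and β is the Hilberg exponent (roughly β ≈ 0.5 in Hilberg's original estimate).

% Shorthand (assumed notation): conditional entropy of the n-th unit
% given all preceding units.
\[
  h_n = H(X_n \mid X_1, \ldots, X_{n-1})
\]
% Constant entropy rate (CER): h_n is the same for every n.
\[
  \text{CER:} \quad h_n = c \quad \text{for all } n \ge 1.
\]
% Hilberg's law: the block entropy scales sublinearly,
% H(X_1, \ldots, X_n) \propto n^{\beta}, so the conditional entropy
% (its increment) decays as a power law instead of staying constant:
\[
  \text{Hilberg:} \quad h_n \propto n^{\beta - 1}, \qquad 0 < \beta < 1.
\]

A decaying h_n is what makes CER inconsistent with Hilberg-type scaling, and since strong UID implies CER (per the abstract), strong UID is inconsistent with it as well.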


Full work available at URL: https://arxiv.org/abs/1304.7359











