Around Kolmogorov complexity: basic notions and results

From MaRDI portal
Publication:2805718

DOI: 10.1007/978-3-319-21852-6_7
zbMATH Open: 1338.68130
arXiv: 1504.04955
OpenAlex: W2127798320
Wikidata: Q57349398
Scholia: Q57349398
MaRDI QID: Q2805718
FDO: Q2805718


Authors: A. Shen


Publication date: 13 May 2016

Published in: Measures of Complexity

Abstract: Algorithmic information theory studies description complexity and randomness and is now a well-known field of theoretical computer science and mathematical logic. There are several textbooks and monographs devoted to this theory in which one can find detailed expositions of many difficult results as well as historical references. However, a short survey of its basic notions and of the main results relating these notions to each other seems to be missing. This report attempts to fill that gap. It covers the basic notions of algorithmic information theory: Kolmogorov complexity (plain, conditional, prefix), Solomonoff universal a priori probability, notions of randomness (Martin-Löf randomness, Mises–Church randomness), and effective Hausdorff dimension. We prove their basic properties (symmetry of information, the connection between a priori probability and prefix complexity, the criterion of randomness in terms of complexity, the complexity characterization of effective dimension) and show some applications (the incompressibility method in computational complexity theory, incompleteness theorems). The survey is based on lecture notes for a course given by the author at Uppsala University.
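The incompressibility method mentioned in the abstract rests on a simple counting argument: there are fewer binary descriptions of length below n than binary strings of length n, so some string of length n has no description shorter than itself. A minimal sketch of this count (an illustration added here, not taken from the paper):

```python
# Counting argument behind incompressibility:
# there are 2^n binary strings of length n, but only
# 2^0 + 2^1 + ... + 2^(n-1) = 2^n - 1 descriptions of length < n.
n = 16
num_strings = 2 ** n                              # strings of length n
num_short_descs = sum(2 ** k for k in range(n))   # descriptions shorter than n

print(num_strings, num_short_descs)               # prints: 65536 65535

# Hence at least one string of length n is incompressible,
# i.e. has plain complexity C(x) >= n.
assert num_short_descs < num_strings
```

The same count shows that for every c, fewer than a 2^{-c} fraction of strings of length n can be compressed by more than c bits.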


Full work available at URL: https://arxiv.org/abs/1504.04955









Cited In (26)





