On generalized computable universal priors and their convergence
Publication: 860822
DOI: 10.1016/j.tcs.2006.07.039
zbMath: 1110.03031
arXiv: cs/0503026
Wikidata: Q58012433 (Scholia: Q58012433)
MaRDI QID: Q860822
Publication date: 9 January 2007
Published in: Theoretical Computer Science
Full work available at URL: https://arxiv.org/abs/cs/0503026
Keywords: algorithmic information theory; Martin-Löf randomness; sequence prediction; mixture distributions; posterior convergence; computability concepts; Solomonoff's prior; universal probability
Mathematics Subject Classification
68Q32: Computational learning theory
68Q30: Algorithmic information theory (Kolmogorov complexity, etc.)
03D80: Applications of computability and recursion theory
Cites Work
- Universal artificial intelligence. Sequential decisions based on algorithmic probability.
- Zufälligkeit und Wahrscheinlichkeit. Eine algorithmische Begründung der Wahrscheinlichkeitstheorie. (Randomness and probability. An algorithmic foundation of probability theory)
- Hierarchies of generalized Kolmogorov complexities and nonenumerable universal measures computable in the limit
- Von Mises' definition of random sequences reconsidered
- A Theory of Program Size Formally Identical to Information Theory
- Complexity-based induction systems: Comparisons and convergence theorems
- Minimum description length induction, Bayesianism, and Kolmogorov complexity
- Convergence and Error Bounds for Universal Prediction of Nonbinary Sequences
- Learning Theory and Kernel Machines
- Algorithmic Learning Theory
- The complexity of finite objects and the development of the concepts of information and randomness by means of the theory of algorithms
- A formal theory of inductive inference. Part II
- Algorithmic Learning Theory