Algorithmic complexity bounds on future prediction errors
From MaRDI portal
Publication: 865627
DOI: 10.1016/j.ic.2006.10.004
zbMath: 1107.68044
arXiv: cs/0701120
Wikidata: Q58012403 (Scholia: Q58012403)
MaRDI QID: Q865627
Alexei Chernov, Marcus Hutter, Jürgen Schmidhuber
Publication date: 20 February 2007
Published in: Information and Computation
Full work available at URL: https://arxiv.org/abs/cs/0701120
Keywords: Kolmogorov complexity; randomness deficiency; Solomonoff prior; total error; future loss; monotone conditional complexity; online sequential prediction; posterior bounds
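As background for these keywords (an illustrative note, not part of this record): the "total error" and "Solomonoff prior" keywords refer to bounds on the cumulative prediction error of the universal mixture \(M\) relative to a computable true measure \(\mu\). The classical total-error bound for binary sequence prediction, due to Solomonoff and sharpened by Hutter, can be stated as:

```latex
\sum_{t=1}^{\infty} \mathbf{E}_{\mu}\!\left[ \bigl( M(1 \mid x_{<t}) - \mu(1 \mid x_{<t}) \bigr)^{2} \right]
\;\le\; \frac{\ln 2}{2}\, K(\mu) \;<\; \infty ,
```

where \(K(\mu)\) is the prefix Kolmogorov complexity of \(\mu\). Finiteness of the sum implies that \(M\)'s posterior predictions converge to \(\mu\)'s with \(\mu\)-probability one; the paper indexed here extends such bounds from total to future (posterior) errors.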
68Q30: Algorithmic information theory (Kolmogorov complexity, etc.)
Related Items
- A philosophical treatise of universal induction
- On calibration error of randomized forecasting algorithms
- On universal prediction and Bayesian confirmation
Cites Work
- Universal artificial intelligence. Sequential decisions based on algorithmic probability.
- Sequential predictions based on algorithmic complexity
- Hierarchies of generalized Kolmogorov complexities and nonenumerable universal measures computable in the limit
- Clustering by Compression
- Convergence and loss bounds for Bayesian sequence prediction
- Complexity-based induction systems: Comparisons and convergence theorems
- Learning Theory
- Convergence and Error Bounds for Universal Prediction of Nonbinary Sequences
- DOI: 10.1162/1532443041827952
- Relations between varieties of Kolmogorov complexities
- Learning Theory and Kernel Machines
- Algorithmic Learning Theory
- Algorithmic Learning Theory
- The complexity of finite objects and the development of the concepts of information and randomness by means of the theory of algorithms
- A formal theory of inductive inference. Part I
- New error bounds for Solomonoff prediction