Asymptotics of Discrete MDL for Online Prediction
Publication:3547504
Abstract: Minimum Description Length (MDL) is an important principle for induction and prediction, with strong relations to optimal Bayesian learning. This paper deals with learning non-i.i.d. processes by means of two-part MDL, where the underlying model class is countable. We consider the online learning framework, i.e. observations come in one by one, and the predictor is allowed to update its state of mind after each time step. We identify two ways of predicting by MDL for this setup, namely a static and a dynamic one. (A third variant, hybrid MDL, will turn out inferior.) We will prove that, under the sole assumption that the data is generated by a distribution contained in the model class, the MDL predictions converge to the true values almost surely. This is accomplished by proving finite bounds on the quadratic, the Hellinger, and the Kullback-Leibler loss of the MDL learner, which are however exponentially worse than for Bayesian prediction. We demonstrate that these bounds are sharp, even for model classes containing only Bernoulli distributions. We show how these bounds imply regret bounds for arbitrary loss functions. Our results apply to a wide range of setups, namely sequence prediction, pattern classification, regression, and universal induction in the sense of Algorithmic Information Theory, among others.
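The two prediction schemes contrasted in the abstract can be sketched as follows for a finite class of Bernoulli models. This is an illustrative toy implementation, not the paper's construction: the model class, the uniform prior (i.e. equal model description lengths), and all function names are assumptions made for the example, and the inferior hybrid variant is omitted.

```python
import math
import random

def codelength(data, theta, prior):
    """Two-part code length: model description length -log(prior)
    plus data code length -log P(data | theta)."""
    ones = sum(data)
    zeros = len(data) - ones
    loglik = ones * math.log(theta) + zeros * math.log(1.0 - theta)
    return -math.log(prior) - loglik

def static_mdl(data, thetas, priors):
    """Static MDL: select the single model minimizing the two-part
    code length on the data seen so far, then predict with it."""
    return min(thetas, key=lambda t: codelength(data, t, priors[t]))

def dynamic_mdl(data, thetas, priors):
    """Dynamic MDL: re-run model selection for each candidate
    continuation of the sequence, and normalize the minimal code
    lengths into a predictive probability P(next symbol = 1)."""
    cl = {x: min(codelength(data + [x], t, priors[t]) for t in thetas)
          for x in (0, 1)}
    m = min(cl.values())  # shift before exponentiating to avoid underflow
    w0 = math.exp(-(cl[0] - m))
    w1 = math.exp(-(cl[1] - m))
    return w1 / (w0 + w1)

# Toy experiment: data generated by a model inside the class,
# as the paper's convergence results assume.
random.seed(1)
true_theta = 0.7
data = [1 if random.random() < true_theta else 0 for _ in range(500)]
thetas = [0.1, 0.3, 0.5, 0.7, 0.9]
priors = {t: 1.0 / len(thetas) for t in thetas}

theta_hat = static_mdl(data, thetas, priors)
p_one = dynamic_mdl(data, thetas, priors)
print(theta_hat, p_one)
```

With enough observations both predictors settle on the true parameter, consistent with the almost-sure convergence the paper proves; the paper's loss bounds quantify how fast, and how much worse than Bayesian prediction this can be.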
Cited in (15)
- Algorithmic Learning Theory
- Catching up Faster by Switching Sooner: A Predictive Approach to Adaptive Estimation with an Application to the AIC–BIC Dilemma
- Learning Theory
- Bridging algorithmic information theory and machine learning: a new approach to kernel learning
- Online forecast combinations of distributions: worst case bounds
- Consistency of discrete Bayesian learning
- A loss bound model for on-line stochastic prediction algorithms
- On the use of MDL principle in gene expression prediction
- Open problems in universal induction & intelligence
- Putnam's diagonal argument and the impossibility of a universal learning machine
- Tractability of batch to sequential conversion
- Learning Theory
- Online estimation of discrete, continuous, and conditional joint densities using classifier chains
- Predictions and algorithmic statistics for infinite sequences
- Prediction and MDL for infinite sequences