Sequential prediction of individual sequences under general loss functions
DOI: 10.1109/18.705569 · zbMATH Open: 1026.68579 · OpenAlex: W2147632820 · MaRDI QID: Q4701166
David Haussler, Jyrki Kivinen, Manfred K. Warmuth
Publication date: 21 November 1999
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://doi.org/10.1109/18.705569
Cited In (24)
- Does snooping help?
- Universal prediction of random binary sequences in a noisy environment
- Fast learning rates in statistical inference through aggregation
- Supermartingales in prediction with expert advice
- Regression Estimation from an Individual Stable Sequence
- Sparse regression learning by aggregation and Langevin Monte-Carlo
- Loss functions, complexities, and the Legendre transformation.
- Network Estimation by Mixing: Adaptivity and More
- Mirror averaging with sparsity priors
- The Fundamental Nature of the Log Loss Function
- Robust forecast combinations
- Generalized mirror averaging and \(D\)-convex aggregation
- A Bayesian approach to (online) transfer learning: theory and algorithms
- Sensor networks: from dependence analysis via matroid bases to online synthesis
- How many strings are easy to predict?
- Prediction with Expert Evaluators’ Advice
- Efficient learning with virtual threshold gates
- Title not available
- Optimal learning with Bernstein Online Aggregation
- Randomized prediction of individual sequences
- The weak aggregating algorithm and weak mixability
- Learning by mirror averaging
- Competitive On-line Statistics
- Probability theory for the Brier game