Sequential prediction of individual sequences under general loss functions
Publication: 4701166
DOI: 10.1109/18.705569
zbMATH Open: 1026.68579
OpenAlex: W2147632820
MaRDI QID: Q4701166
FDO: Q4701166
Authors: David Haussler, Jyrki Kivinen, Manfred K. Warmuth
Publication date: 21 November 1999
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://doi.org/10.1109/18.705569
Cited In (24)
- Does snooping help?
- Universal prediction of random binary sequences in a noisy environment
- The fundamental nature of the log loss function
- Prediction with expert evaluators' advice
- Fast learning rates in statistical inference through aggregation
- Supermartingales in prediction with expert advice
- Regression Estimation from an Individual Stable Sequence
- Sparse regression learning by aggregation and Langevin Monte-Carlo
- Loss functions, complexities, and the Legendre transformation.
- Network Estimation by Mixing: Adaptivity and More
- Mirror averaging with sparsity priors
- Robust forecast combinations
- Generalized mirror averaging and \(D\)-convex aggregation
- A Bayesian approach to (online) transfer learning: theory and algorithms
- Sensor networks: from dependence analysis via matroid bases to online synthesis
- How many strings are easy to predict?
- Efficient learning with virtual threshold gates
- Optimal learning with Bernstein online aggregation
- Randomized prediction of individual sequences
- The weak aggregating algorithm and weak mixability
- Learning by mirror averaging
- Competitive On-line Statistics
- Probability theory for the Brier game
- Memoryless sequences for general losses