On the complexity of learning from drifting distributions
Publication:1376422
DOI: 10.1006/INCO.1997.2656
zbMATH Open: 0887.68093
OpenAlex: W2092714606
MaRDI QID: Q1376422
FDO: Q1376422
Authors: Rakesh D. Barve, Philip M. Long
Publication date: 25 May 1998
Published in: Information and Computation
Full work available at URL: https://doi.org/10.1006/inco.1997.2656
Cites Work
- Present Position and Potential Developments: Some Personal Views: Statistical Theory: The Prequential Approach
- Convergence of stochastic processes
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities
- On the density of families of sets
- Sharper bounds for Gaussian and empirical processes
- Title not available
- Learnability and the Vapnik-Chervonenkis dimension
- The weighted majority algorithm
- A theory of the learnable
- Tracking the best expert
- A general lower bound on the number of examples needed for learning
- Probably Approximate Learning of Sets and Functions
- The hardness of approximate optima in lattices, codes, and systems of linear equations
- Title not available
- Title not available
- Decision theoretic generalizations of the PAC model for neural net and other learning applications
- Tracking drifting concepts by minimizing disagreements
- Predicting \(\{ 0,1\}\)-functions on randomly drawn points
- Prediction, learning, uniform convergence, and scale-sensitive dimensions
- Title not available
- Title not available
- Learning changing concepts by exploiting the structure of change
Cited In (11)
- Advances in Artificial Intelligence – SBIA 2004
- Generalization bounds for non-stationary mixing processes
- A no-free-lunch theorem for multitask learning
- New analysis and algorithm for learning with drifting distributions
- Learning with a drifting target concept
- Discrepancy-based theory and algorithms for forecasting non-stationary time series
- The complexity of learning according to two models of a drifting environment
- Learning distributions by their density levels: A paradigm for learning without a teacher
- Improved lower bounds for learning from noisy examples: An information-theoretic approach
- Learning from non-iid data: fast rates for the one-vs-all multiclass plug-in classifiers
- Learning changing concepts by exploiting the structure of change