Universal regression with adversarial responses
From MaRDI portal
Publication:6136596
Abstract: We provide algorithms for regression with adversarial responses under large classes of non-i.i.d. instance sequences, on general separable metric spaces, with provably minimal assumptions. We also characterize learnability in this regression setting. We consider universal consistency, which asks for strong consistency of a learner without restrictions on the response values. Our analysis shows that this objective is achievable for a significantly larger class of instance sequences than stationary processes, and it unveils a fundamental dichotomy between value spaces according to whether finite-horizon mean estimation is achievable. We further provide optimistically universal learning rules, i.e., rules such that if they fail to achieve universal consistency, every other algorithm fails as well. For unbounded losses, we propose a mild integrability condition under which there exist algorithms for adversarial regression under large classes of non-i.i.d. instance sequences. In addition, our analysis provides a learning rule for mean estimation in general metric spaces that is consistent under adversarial responses without any moment conditions on the sequence, a result of independent interest.
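The mean-estimation result above concerns generalizations of the mean to arbitrary metric spaces, where averaging is replaced by minimizing aggregate (squared) distance. As a minimal illustrative sketch — not the paper's algorithm — the following Python snippet computes an empirical Fréchet mean restricted to the sample itself, for any user-supplied metric; the names `frechet_mean` and `ang_dist` are hypothetical helpers introduced here for illustration:

```python
import math

def frechet_mean(points, dist):
    """Return the sample point minimizing the sum of squared distances
    to all observed points (an empirical Frechet mean restricted to
    candidates drawn from the sample itself)."""
    best, best_cost = None, math.inf
    for candidate in points:
        cost = sum(dist(candidate, p) ** 2 for p in points)
        if cost < best_cost:
            best, best_cost = candidate, cost
    return best

# Example metric space: angles on the circle with geodesic distance.
def ang_dist(a, b):
    d = abs(a - b) % (2 * math.pi)
    return min(d, 2 * math.pi - d)

sample = [0.0, 0.1, 0.2, 6.2]  # angles in radians, clustered near 0
print(frechet_mean(sample, ang_dist))  # -> 0.1
```

Note that on the circle the Fréchet mean can differ sharply from the naive arithmetic average (here roughly 1.6, far from the cluster), which is why metric-space formulations of the mean are needed in this setting.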
Recommendations
- Learning whenever learning is possible: universal learning under general stochastic processes
- Learning with stochastic inputs and adversarial outputs
- Algorithmic Learning Theory
- Theory and Applications of Models of Computation
- Universal algorithms for learning theory. I: Piecewise constant functions.
Cites work
- scientific article; zbMATH DE number 1375577
- scientific article; zbMATH DE number 893887
- scientific article; zbMATH DE number 7415073
- scientific article; zbMATH DE number 7415094
- A decision-theoretic generalization of on-line learning and an application to boosting
- A distribution-free theory of nonparametric regression
- A simple randomized algorithm for sequential prediction of ergodic time series
- Consistent nonparametric regression. Discussion
- Geodesic regression and the theory of least squares on Riemannian manifolds
- How to use expert advice
- Introduction to multi-armed bandits
- Learning from dependent observations
- Nonparametric inference for ergodic, stationary time series
- On the strong universal consistency of nearest neighbor regression function estimates
- Online learning via sequential complexities
- Prediction, Learning, and Games
- Probability, Random Processes, and Ergodic Properties
- Regression Estimation from an Individual Stable Sequence
- Regret analysis of stochastic and nonstochastic multi-armed bandit problems
- Sequential Prediction of Unbounded Stationary Time Series
- Spherical regression with errors in variables
- Strong laws of large numbers for generalizations of Fréchet mean sets
- The weighted majority algorithm
- Total variation regularized Fréchet regression for metric-space valued data
- Universal Bayes consistency in metric spaces
- Universal regression with adversarial responses