Learning theory: stability is sufficient for generalization and necessary and sufficient for consistency of empirical risk minimization
From MaRDI portal
Publication: 2498399
DOI: 10.1007/s10444-004-7634-z
zbMath: 1099.68693
Wikidata: Q56906307
Scholia: Q56906307
MaRDI QID: Q2498399
Authors: Tomaso Poggio, Partha Niyogi, Sayan Mukherjee, Ryan Rifkin
Publication date: 16 August 2006
Published in: Advances in Computational Mathematics
Full work available at URL: https://doi.org/10.1007/s10444-004-7634-z
Keywords: stability; consistency; inverse problems; generalization; empirical risk minimization; uniform Glivenko-Cantelli
68T05: Learning and adaptive systems in artificial intelligence
Related Items
- Graph-Dependent Implicit Regularisation for Distributed Stochastic Subgradient Descent
- Stability and optimization error of stochastic gradient descent for pairwise learning
- Learning Rates of Regularized Regression for Functional Data
- For interpolating kernel machines, minimizing the norm of the ERM solution maximizes stability
- Stability and generalization of graph convolutional networks in eigen-domains
- Neural ODE Control for Classification, Approximation, and Transport
- Stability, Robustness and generalization
- Derivative reproducing properties for kernel methods in learning theory
- OCReP: an optimally conditioned regularization for pseudoinversion based neural training
- System identification using kernel-based regularization: new insights on stability and consistency issues
- Robust pairwise learning with Huber loss
- A tight upper bound on the generalization error of feedforward neural networks
- Design-unbiased statistical learning in survey sampling
- Robustness by reweighting for kernel estimators: an overview
- Learning theory: stability is sufficient for generalization and necessary and sufficient for consistency of empirical risk minimization
- Supervised Learning by Support Vector Machines
- Measuring the Stability of Results From Supervised Statistical Learning
Cites Work
- Uniform and universal Glivenko-Cantelli classes
- Type, infratype and the Elton-Pajor theorem
- A general class of exponential inequalities for martingales and ratios
- The covering number in learning theory
- Regularization networks and support vector machines
- Learning theory: stability is sufficient for generalization and necessary and sufficient for consistency of empirical risk minimization
- Majorizing measures: The generic chaining
- On the mathematical foundations of learning
- Distribution-free performance bounds for potential function rules
- Uniform Central Limit Theorems
- Scale-sensitive dimensions, uniform convergence, and learnability
- 10.1162/153244302760200704
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities
- Convergence of stochastic processes