A Vector-Contraction Inequality for Rademacher Complexities
Publication: 2830263
DOI: 10.1007/978-3-319-46379-7_1 · zbMath: 1478.68296 · arXiv: 1605.00251 · MaRDI QID: Q2830263
Publication date: 9 November 2016
Published in: Lecture Notes in Computer Science
Full work available at URL: https://arxiv.org/abs/1605.00251
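For reference, the paper's main result (paraphrased from the linked arXiv abstract; see the preprint for the precise conditions) is a contraction inequality for vector-valued function classes: if \(\mathcal{F}\) is a class of functions \(f: \mathcal{X} \to \ell_2\) and the \(h_i: \ell_2 \to \mathbb{R}\) are \(L\)-Lipschitz, then
\[
\mathbb{E}\,\sup_{f \in \mathcal{F}} \sum_i \epsilon_i\, h_i(f(x_i)) \;\le\; \sqrt{2}\, L\, \mathbb{E}\,\sup_{f \in \mathcal{F}} \sum_{i,k} \epsilon_{ik}\, f_k(x_i),
\]
where \(\epsilon_i\) and \(\epsilon_{ik}\) are independent Rademacher variables and \(f_k(x_i)\) is the \(k\)-th component of \(f(x_i)\).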
MSC classification
62H30: Classification and discrimination; cluster analysis (statistical aspects)
68Q32: Computational learning theory
60E15: Inequalities; stochastic orderings
68T05: Learning and adaptive systems in artificial intelligence
Related Items
- Convergence Analysis of Machine Learning Algorithms for the Numerical Solution of Mean Field Control and Games I: The Ergodic Case
- Weakly Convex Optimization over Stiefel Manifold Using Riemannian Subgradient-Type Methods
- Graphical Convergence of Subgradients in Nonconvex Optimization and Learning
- Multi-kernel learning for multi-label classification with local Rademacher complexity
- Also for \(k\)-means: more data does not imply better performance
- Robust \(k\)-means clustering for distributions with two moments
- On strong consistency of kernel \(k\)-means: a Rademacher complexity approach
- Compressive sensing and neural networks from a statistical learning perspective
- From inexact optimization to learning via gradient concentration
- Handling concept drift via model reuse
Cites Work
- Empirical margin distributions and bounding the generalization error of combined classifiers
- Transfer bounds for linear feature learning
- Optimal rates for the regularized least-squares algorithm
- Concentration Inequalities
- 10.1162/15324430260185628
- On the Performance of Clustering in Hilbert Spaces
- On the best constants in the Khinchin inequality
- 10.1162/1532443041424300
- 10.1162/153244303321897690
- $K$-Dimensional Coding Schemes in Hilbert Spaces