A Vector-Contraction Inequality for Rademacher Complexities
Publication: 2830263 (MaRDI portal)
Abstract: The contraction inequality for Rademacher averages is extended to Lipschitz functions with vector-valued domains, and it is also shown that in the bounding expression the Rademacher variables can be replaced by arbitrary i.i.d. symmetric and sub-Gaussian variables. Example applications are given for multi-category learning, K-means clustering, and learning-to-learn.
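For orientation, the main result can be paraphrased as follows (our notation, not the paper's exact statement): if \(\mathcal{F}\) is a class of functions \(f:\mathcal{X}\to\ell_2\) and the \(h_i:\ell_2\to\mathbb{R}\) are \(L\)-Lipschitz, then for Rademacher variables \(\epsilon_i\) and a doubly indexed Rademacher sequence \(\epsilon_{ik}\),

\[
\mathbb{E}\sup_{f\in\mathcal{F}}\sum_i \epsilon_i\, h_i\bigl(f(x_i)\bigr) \;\le\; \sqrt{2}\,L\,\mathbb{E}\sup_{f\in\mathcal{F}}\sum_{i,k}\epsilon_{ik}\, f_k(x_i),
\]

where \(f_k(x_i)\) is the \(k\)-th coordinate of \(f(x_i)\). The sketch below is a minimal Monte Carlo sanity check of this inequality on a toy setup, assuming a finite class of Frobenius-norm-one linear maps \(f_W(x)=Wx\) and \(h=\|\cdot\|_2\) (which is 1-Lipschitz); all names and parameters are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, K, m, trials = 40, 8, 4, 30, 2000       # sample size, input dim, output dim, class size, MC draws
X = rng.normal(size=(n, d))                   # fixed sample x_1, ..., x_n
Ws = rng.normal(size=(m, K, d))               # finite class of linear maps f_W(x) = W x
Ws /= np.linalg.norm(Ws, axis=(1, 2), keepdims=True)  # normalize to Frobenius norm 1

FX = np.einsum('mkd,nd->mnk', Ws, X)          # f_W(x_i) for every W: shape (m, n, K)

# Left side: E sup_W sum_i eps_i * h(f_W(x_i)), with h the Euclidean norm (1-Lipschitz)
H = np.linalg.norm(FX, axis=2)                # h(f_W(x_i)): shape (m, n)
eps = rng.choice([-1.0, 1.0], size=(trials, n))
lhs = (eps @ H.T).max(axis=1).mean()          # max over the finite class, mean over draws

# Right side: sqrt(2) * E sup_W sum_{i,k} eps_{ik} * f_W(x_i)_k
eps2 = rng.choice([-1.0, 1.0], size=(trials, n, K))
rhs = np.sqrt(2) * np.einsum('tnk,mnk->tm', eps2, FX).max(axis=1).mean()

print(f"LHS ~ {lhs:.3f}  <=  RHS ~ {rhs:.3f}")
```

With a finite class the suprema reduce to exact maxima, so the only approximation is the Monte Carlo average over the Rademacher draws.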
Recommendations
- doi:10.1162/153244303321897690
- scientific article; zbMATH DE number 1804106
- Approximation error bounds via Rademacher's complexity
- Rademacher inequalities with applications
- On the $L_p$ norm of the Rademacher projection and related inequalities
- Rademacher Margin Complexity
- The Rademacher Complexity of Linear Transformation Classes
- Estimates of the approximation error using Rademacher complexity: Learning vector-valued functions
- The Khinchin Inequality for Generalized Rademacher Functions
- A \(\Phi\)-entropy contraction inequality for Gaussian vectors
Cites work
- scientific article; zbMATH DE number 49190
- scientific article; zbMATH DE number 1448976
- $K$-Dimensional Coding Schemes in Hilbert Spaces
- doi:10.1162/15324430260185628
- doi:10.1162/153244303321897690
- doi:10.1162/1532443041424300
- Concentration inequalities. A nonasymptotic theory of independence
- Empirical margin distributions and bounding the generalization error of combined classifiers
- On the Performance of Clustering in Hilbert Spaces
- On the best constants in the Khinchin inequality
- Optimal rates for the regularized least-squares algorithm
- Regularization techniques for learning with matrices
- The benefit of multitask representation learning
- Transfer bounds for linear feature learning
Cited in (14)
- Graphical convergence of subgradients in nonconvex optimization and learning
- Handling concept drift via model reuse
- Weakly convex optimization over Stiefel manifold using Riemannian subgradient-type methods
- Compressive sensing and neural networks from a statistical learning perspective
- Multi-kernel learning for multi-label classification with local Rademacher complexity
- Weight normalized deep neural networks
- Algorithmic Learning Theory
- Convergence analysis of machine learning algorithms for the numerical solution of mean field control and games. I: The ergodic case
- From inexact optimization to learning via gradient concentration
- scientific article; zbMATH DE number 7370541
- On strong consistency of kernel \(k\)-means: a Rademacher complexity approach
- Also for \(k\)-means: more data does not imply better performance
- Robust \(k\)-means clustering for distributions with two moments
- Estimates of the approximation error using Rademacher complexity: Learning vector-valued functions