A Vector-Contraction Inequality for Rademacher Complexities
DOI: 10.1007/978-3-319-46379-7_1
zbMATH Open: 1478.68296
arXiv: 1605.00251
OpenAlex: W2962708723
MaRDI QID: Q2830263
FDO: Q2830263
Authors: Andreas Maurer
Publication date: 9 November 2016
Published in: Lecture Notes in Computer Science
Full work available at URL: https://arxiv.org/abs/1605.00251
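For reference, the paper's main result, as stated in the arXiv abstract linked above: for a class \(\mathcal{F}\) of functions \(f: X \to \ell_2\), points \(x_1, \dots, x_n \in X\), functions \(h_i: \ell_2 \to \mathbb{R}\) with Lipschitz constant \(L\), and independent Rademacher variables \(\epsilon_i\) and \(\epsilon_{ik}\),
\[
E \sup_{f \in \mathcal{F}} \sum_i \epsilon_i\, h_i(f(x_i)) \;\le\; \sqrt{2}\, L\; E \sup_{f \in \mathcal{F}} \sum_{i,k} \epsilon_{ik}\, f_k(x_i),
\]
where \(f_k(x_i)\) denotes the \(k\)-th component of \(f(x_i)\). This extends the scalar contraction lemma for Rademacher complexities to vector-valued function classes, at the cost of a factor \(\sqrt{2}\).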
Recommendations
- Rademacher and Gaussian complexities: risk bounds and structural results (DOI: 10.1162/153244303321897690)
- Title not available (zbMATH DE number 1804106)
- Approximation error bounds via Rademacher's complexity
- Rademacher inequalities with applications
- On the \(L_p\) norm of the Rademacher projection and related inequalities
- Rademacher Margin Complexity
- The Rademacher Complexity of Linear Transformation Classes
- Estimates of the approximation error using Rademacher complexity: Learning vector-valued functions
- The Khinchin Inequality for Generalized Rademacher Functions
- A \(\Phi \)-entropy contraction inequality for Gaussian vectors
Classification
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Learning and adaptive systems in artificial intelligence (68T05)
- Inequalities; stochastic orderings (60E15)
- Computational learning theory (68Q32)
Cites Work
- Title not available
- Transfer bounds for linear feature learning
- Concentration inequalities. A nonasymptotic theory of independence
- Empirical margin distributions and bounding the generalization error of combined classifiers
- Optimal rates for the regularized least-squares algorithm
- Rademacher and Gaussian complexities: risk bounds and structural results (DOI: 10.1162/153244303321897690)
- DOI: 10.1162/15324430260185628 (title not available)
- DOI: 10.1162/1532443041424300 (title not available)
- Title not available
- $K$-Dimensional Coding Schemes in Hilbert Spaces
- Regularization techniques for learning with matrices
- On the Performance of Clustering in Hilbert Spaces
- On the best constants in the Khinchin inequality
- The benefit of multitask representation learning
Cited In (14)
- Estimates of the approximation error using Rademacher complexity: Learning vector-valued functions
- Multi-kernel learning for multi-label classification with local Rademacher complexity
- Handling concept drift via model reuse
- Algorithmic Learning Theory
- Weakly Convex Optimization over Stiefel Manifold Using Riemannian Subgradient-Type Methods
- Title not available
- Robust \(k\)-means clustering for distributions with two moments
- From inexact optimization to learning via gradient concentration
- On strong consistency of kernel \(k\)-means: a Rademacher complexity approach
- Compressive sensing and neural networks from a statistical learning perspective
- Convergence Analysis of Machine Learning Algorithms for the Numerical Solution of Mean Field Control and Games I: The Ergodic Case
- Weight normalized deep neural networks
- Graphical Convergence of Subgradients in Nonconvex Optimization and Learning
- Also for \(k\)-means: more data does not imply better performance