An improved analysis of the Rademacher data-dependent bound using its self bounding property
Cites work
- Scientific article, zbMATH DE number 4170917 (no title available)
- Scientific article, zbMATH DE number 1332320 (no title available)
- DOI 10.1162/153244303321897690 (no title available)
- A sharp concentration inequality with applications
- An improved analysis of the Rademacher data-dependent bound using its self bounding property
- Aspects of discrete mathematics and probability in the theory of machine learning
- Concentration inequalities using the entropy method
- Learning pattern classification-a survey
- Local Rademacher complexities
- Local Rademacher complexity: sharper risk bounds with and without unlabeled samples
- Model selection and error estimation
- Rademacher penalties and structural risk minimization
- Some limit theorems for empirical processes (with discussion)
- Structural risk minimization over data-dependent hierarchies
Cited in (5)
- Tikhonov, Ivanov and Morozov regularization for support vector machine learning
- An improved analysis of the Rademacher data-dependent bound using its self bounding property
- Selective Rademacher penalization and reduced error pruning of decision trees
- Local Rademacher complexity: sharper risk bounds with and without unlabeled samples
- Percolation centrality via Rademacher Complexity