An improved analysis of the Rademacher data-dependent bound using its self bounding property
DOI: 10.1016/j.neunet.2013.03.017 · zbMATH Open: 1296.68137 · OpenAlex: W2031690114 · Wikidata: Q30616773 · Scholia: Q30616773 · MaRDI QID: Q459446 · FDO: Q459446
Davide Anguita, Sandro Ridella, Luca Oneto, Alessandro Ghio
Publication date: 9 October 2014
Published in: Neural Networks
Full work available at URL: https://doi.org/10.1016/j.neunet.2013.03.017
MSC Classification
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Learning and adaptive systems in artificial intelligence (68T05)
Cites Work
- Local Rademacher complexities
- Rademacher and Gaussian complexities: risk bounds and structural results (DOI: 10.1162/153244303321897690)
- Concentration inequalities using the entropy method
- Structural risk minimization over data-dependent hierarchies
- Some limit theorems for empirical processes (with discussion)
- Rademacher penalties and structural risk minimization
- An improved analysis of the Rademacher data-dependent bound using its self bounding property
- Model selection and error estimation
- A sharp concentration inequality with applications
- Local Rademacher complexity: sharper risk bounds with and without unlabeled samples
- Aspects of discrete mathematics and probability in the theory of machine learning
- Learning pattern classification-a survey
Cited In (4)
- Tikhonov, Ivanov and Morozov regularization for support vector machine learning
- An improved analysis of the Rademacher data-dependent bound using its self bounding property
- Local Rademacher complexity: sharper risk bounds with and without unlabeled samples
- Percolation centrality via Rademacher Complexity