Local Rademacher complexity: sharper risk bounds with and without unlabeled samples
Publication: 1669081
DOI: 10.1016/j.neunet.2015.02.006
zbMath: 1394.68306
OpenAlex: W2009784682
Wikidata: Q50597884
Scholia: Q50597884
MaRDI QID: Q1669081
Alessandro Ghio, Davide Anguita, Sandro Ridella, Luca Oneto
Publication date: 30 August 2018
Published in: Neural Networks
Full work available at URL: https://doi.org/10.1016/j.neunet.2015.02.006
Related Items (6)
- Unnamed Item
- An improved analysis of the Rademacher data-dependent bound using its self bounding property
- Learning bounds of ERM principle for sequences of time-dependent samples
- Unnamed Item
- A local Vapnik-Chervonenkis complexity
- Convolutional spectral kernel learning with generalization guarantees
Cites Work
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- An improved analysis of the Rademacher data-dependent bound using its self bounding property
- Oracle inequalities in empirical risk minimization and sparse recovery problems. École d'Été de Probabilités de Saint-Flour XXXVIII-2008.
- Fast rates for support vector machines using Gaussian kernels
- Fast learning rates for plug-in classifiers
- The Glivenko-Cantelli problem
- A Bennett concentration inequality and its application to suprema of empirical processes
- Tighter PAC-Bayes bounds through distribution-dependent priors
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- Local Rademacher complexities
- A sharp concentration inequality with applications
- 10.1162/153244302760200704
- 10.1162/153244303321897690
- Are Loss Functions All the Same?
- PAC-Bayesian Theory
- Advanced Lectures on Machine Learning
- Model selection and error estimation