Localization of VC classes: beyond local Rademacher complexities
DOI: 10.1007/978-3-319-46379-7_2
zbMATH Open: 1398.68471
arXiv: 1606.00922
OpenAlex: W3022373602
MaRDI QID: Q1663641
FDO: Q1663641
Steve Hanneke, Nikita Zhivotovskiy
Publication date: 22 August 2018
Published in: Theoretical Computer Science; Lecture Notes in Computer Science
Full work available at URL: https://arxiv.org/abs/1606.00922
Keywords: statistical learning; empirical risk minimization; VC dimension; PAC learning; ERM; star number; Alexander's capacity; disagreement coefficient; local metric entropy; local Rademacher process; Massart's noise condition; offset Rademacher process; shifted empirical process
MSC: Classification and discrimination; cluster analysis (statistical aspects) (62H30); Learning and adaptive systems in artificial intelligence (68T05)
Cites Work
- Six further cited works (titles not available)
- Convergence of estimates under dimensionality restrictions
- Oracle inequalities in empirical risk minimization and sparse recovery problems. École d'Été de Probabilités de Saint-Flour XXXVIII-2008.
- Sharper bounds for Gaussian and empirical processes
- Model selection in nonparametric regression
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- Local Rademacher complexities
- Concentration inequalities and model selection. École d'Été de Probabilités de Saint-Flour XXXIII -- 2003.
- Concentration inequalities and asymptotic results for ratio type empirical processes
- Upper and Lower Bounds for Stochastic Processes
- Combinatorial methods in density estimation
- Sphere packing numbers for subsets of the Boolean \(n\)-cube with bounded Vapnik-Chervonenkis dimension
- Neural Network Learning
- A general lower bound on the number of examples needed for learning
- Theory of Classification: a Survey of Some Recent Advances
- Rates of growth and sample moduli for weighted empirical processes indexed by sets
- Information-theoretic determination of minimax rates of convergence
- Risk bounds for statistical learning
- Learning and generalisation. With applications to neural networks.
- Theory of Disagreement-Based Active Learning
- Using the doubling dimension to analyze the generalization of learning algorithms
- Consistency for the least squares estimator in nonparametric regression
- Information-Based Complexity, Feedback and Dynamics in Convex Programming
- Empirical entropy, minimax regret and minimax risk
- Predicting \(\{ 0,1\}\)-functions on randomly drawn points
- Oracle inequalities for cross-validation type procedures
- The optimal sample complexity of PAC learning
- A new PAC bound for intersection-closed concept classes
- Empirical and Poisson processes on classes of sets or functions too large for central limit theorems
- ``Local'' vs. ``global'' parameters -- breaking the Gaussian complexity barrier
- Minimax analysis of active learning
- Refined error bounds for several learning algorithms
Cited In (1)