Bounds on the sample complexity of Bayesian learning using information theory and the VC dimension
zbMath: 0798.68145
MaRDI QID: Q1314506
Authors: David Haussler, Michael Kearns, Robert E. Schapire
Publication date: 3 March 1994
Published in: Machine Learning
Keywords: Bayesian learning · VC dimension · statistical physics · information theory · learning curves · average-case learning
Related Items (11)
Sphere packing numbers for subsets of the Boolean \(n\)-cube with bounded Vapnik-Chervonenkis dimension
Learning from a population of hypotheses
Characterizing rational versus exponential learning curves
Mutual information, metric entropy and cumulative relative entropy risk
MetaBayes: Bayesian Meta-Interpretative Learning Using Higher-Order Stochastic Refinement
Rigorous learning curve bounds from statistical mechanics
Learning a priori constrained weighted majority votes
QG/GA: a stochastic search for Progol
Sample size lower bounds in PAC learning by Algorithmic Complexity Theory
Bayesian predictiveness, exchangeability and sufficientness in bacterial taxonomy
Query by committee, linear separation and random walks