Sphere packing numbers for subsets of the Boolean \(n\)-cube with bounded Vapnik-Chervonenkis dimension



DOI: 10.1016/0097-3165(95)90052-7
zbMATH: 0818.60005
MaRDI QID: Q1345876

David Haussler

Publication date: 7 August 1995

Published in: Journal of Combinatorial Theory. Series A

Full work available at URL: https://doi.org/10.1016/0097-3165(95)90052-7
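For orientation, the paper's main result is the packing bound for classes of bounded Vapnik-Chervonenkis dimension. The statement below is a paraphrase from standard accounts of the paper (not text taken from this record), so the exact constants and separation convention should be checked against the original.

\[
  \text{If } \mathcal{C} \subseteq \{0,1\}^n \text{ has VC dimension } d
  \text{ and any two distinct elements of } \mathcal{C} \text{ differ in more than } k \text{ coordinates, then }
  |\mathcal{C}| \;\le\; e\,(d+1)\left(\frac{2en}{k}\right)^{d}.
\]

Equivalently, in the normalized form: for any probability measure \(P\) and any class of VC dimension \(d\), an \(\varepsilon\)-separated subset in the \(L_1(P)\) pseudometric \(d_P(f,g) = P(f \neq g)\) has cardinality at most \(e(d+1)(2e/\varepsilon)^{d}\).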


68U05: Computer graphics; computational geometry (digital and algorithmic aspects)

60C05: Combinatorial probability

05B40: Combinatorial aspects of packing and covering


Related Items

Entropy, Randomization, Derandomization, and Discrepancy
Finite sample properties of system identification of ARX models under mixing conditions
Improved bounds on the sample complexity of learning
Nonlinear approximations using sets of finite cardinality or finite pseudo-dimension
VC bounds on the cardinality of nearly orthogonal function classes
Generalization ability of fractional polynomial models
Discrepancy, chaining and subgaussian processes
Monte Carlo algorithms for optimal stopping and statistical learning
On combinatorial testing problems
Optimal adaptive sampling recovery
Covering numbers, dyadic chaining and discrepancy
Risk bounds for statistical learning
Covering numbers, Vapnik-Červonenkis classes and bounds for the star-discrepancy
Using the doubling dimension to analyze the generalization of learning algorithms
Constrained versions of Sauer's Lemma
Corrigendum to ``Shifting: One-inclusion mistake bounds and sample compression''
One-inclusion hypergraph density revisited
Learning near-optimal policies with Bellman-residual minimization based fitted policy iteration and a single sample path
A graph-theoretic generalization of the Sauer-Shelah lemma
The degree of approximation of sets in euclidean space using sets with bounded Vapnik-Chervonenkis dimension
Prediction, learning, uniform convergence, and scale-sensitive dimensions
On the learnability of rich function classes
Vapnik-Chervonenkis dimension and (pseudo-)hyperplane arrangements
Symmetrization approach to concentration inequalities for empirical processes.
On prediction of individual sequences
Bayesian predictiveness, exchangeability and sufficientness in bacterial taxonomy
Improved upper bounds for probabilities of uniform deviations
Point sets on the sphere \(\mathbb{S}^{2}\) with small spherical cap discrepancy
The Lasso as an \(\ell _{1}\)-ball model selection procedure
Tractability results for the weighted star-discrepancy
Proof techniques in quasi-Monte Carlo theory
Model selection by bootstrap penalization for classification
Ranking and empirical minimization of \(U\)-statistics
Bracketing entropy and VC-dimension
Aspects of discrete mathematics and probability in the theory of machine learning
On the complexity of constrained VC-classes
Bracketing numbers for axis-parallel boxes and applications to geometric discrepancy
Approximation by neural networks and learning theory
Empirical minimization
Pseudo-dimension and entropy of manifolds formed by affine-invariant dictionary
Shifting: one-inclusion mistake bounds and sample compression
Approximation of Sobolev-type classes with quasi-seminorms
Bounds and constructions for the star-discrepancy via \(\delta\)-covers
On the orders of nonlinear approximations for classes of functions of given form
Complexities of convex combinations and bounding the generalization error in classification
Local Rademacher complexities
Probabilistic Star Discrepancy Bounds for Double Infinite Random Matrices
On the Optimality of Sample-Based Estimates of the Expectation of the Empirical Minimizer
Theory of Classification: a Survey of Some Recent Advances
Convergence of a Least‐Squares Monte Carlo Algorithm for Bounded Approximating Sets
On the VC-dimension and boolean functions with long runs


