Teaching and Compressing for Low VC-Dimension
Publication: Q4604393
Abstract: In this work we study the quantitative relation between VC-dimension and two other basic parameters related to learning and teaching: the quality of sample compression schemes and of teaching sets for classes of low VC-dimension. Let \(C\) be a binary concept class of size \(m\) and VC-dimension \(d\). Prior to this work, the best known upper bounds for both parameters were \(\log m\), while the best lower bounds are linear in \(d\). We present significantly better upper bounds on both, as follows. Set \(k = O(d 2^d \log \log |C|)\). We show that there always exists a concept \(c\) in \(C\) with a teaching set (i.e. a list of \(C\)-labeled examples uniquely identifying \(c\) in \(C\)) of size \(k\). This problem was studied by Kuhlmann (1999). Our construction implies that the recursive teaching (RT) dimension of \(C\) is at most \(k\) as well. The RT-dimension was suggested by Zilles et al. and Doliwa et al. (2010). The same notion (under the name partial-ID width) was independently studied by Wigderson and Yehudayoff (2013). An upper bound on this parameter that depends only on \(d\) is known just for the very simple case \(d = 1\), and is open even for \(d = 2\). We also make small progress towards this seemingly modest goal. We further construct sample compression schemes of size \(k\) for \(C\), with additional information of \(k \log k\) bits. Roughly speaking, given any list of \(C\)-labeled examples of arbitrary length, we can retain only \(k\) labeled examples in a way that allows us to recover the labels of all other examples in the list, using \(k \log k\) additional information bits. This problem was first suggested by Littlestone and Warmuth (1986).
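To make the notion of a teaching set concrete, here is a minimal brute-force sketch. It is illustrative only: the toy threshold class, the function name `teaching_set`, and all other details are assumptions for exposition, not the paper's construction. It searches for a smallest list of labeled examples that uniquely identifies a target concept within a finite class.

```python
# A minimal brute-force sketch of the teaching-set definition above; the
# toy threshold class and all names are illustrative, not from the paper.
from itertools import combinations

def teaching_set(concepts, target, domain):
    """Return a smallest set of labeled examples that uniquely
    identifies `target` within `concepts` (concept = dict x -> {0,1})."""
    for size in range(len(domain) + 1):
        for examples in combinations(domain, size):
            labels = {x: target[x] for x in examples}
            consistent = [c for c in concepts
                          if all(c[x] == y for x, y in labels.items())]
            if consistent == [target]:  # target is the only consistent concept
                return labels
    return None

# Toy class: threshold functions on {0, 1, 2, 3}; its VC-dimension is 1.
domain = list(range(4))
concepts = [{x: int(x >= t) for x in domain} for t in range(5)]
for c in concepts:
    print(c, "-> teaching set:", teaching_set(concepts, c, domain))
```

The exhaustive search above is exponential and only meant to illustrate the definition; the paper's contribution is the guarantee that some concept in any class of VC-dimension \(d\) admits a teaching set of size \(k = O(d 2^d \log \log |C|)\), which in turn bounds the recursive teaching dimension.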
Recommendations
- Recursive teaching dimension, VC-dimension and sample compression
- Learning in compressed space
- Compressed representation of learning spaces
- Combined compression and classification with learning vector quantization
- Learning as Data Compression
- Unlabeled compression schemes exceeding the VC-dimension
- Sample Compression Schemes for VC Classes
Cites work
- scientific article; zbMATH DE number 67631
- The Set Covering Machine (DOI: 10.1162/jmlr.2003.3.4-5.723)
- A geometric approach to sample compression
- A theory of the learnable
- Algebraic methods proving Sauer's bound for teaching complexity
- An introduction to support vector machines and other kernel-based learning methods
- Boosting a weak learning algorithm by majority
- Central limit theorems for empirical measures
- Combinatorial variability of Vapnik-Chervonenkis classes with applications to sample compression schemes
- Density and dimension
- Externally definable sets and dependent pairs
- Inferring decision trees using the minimum description length principle
- Learnability and the Vapnik-Chervonenkis dimension
- Learning Binary Relations and Total Orders
- Learning Integer Lattices
- Learning from different teachers
- Models of cooperative teaching and learning
- Occam's razor
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities
- On the complexity of teaching
- On the density of families of sets
- On weak learning
- Population recovery and partial identification
- Recursive teaching dimension, learning complexity, and maximum classes
- Restriction access
- Sample Compression Schemes for VC Classes
- Shifting: one-inclusion mistake bounds and sample compression
- Sphere packing numbers for subsets of the Boolean \(n\)-cube with bounded Vapnik-Chervonenkis dimension
- Teachability in computational learning
- Teaching Dimension and the Complexity of Active Learning
- Teaching a smarter learner
- Unlabeled compression schemes for maximum classes
- \(\epsilon\)-nets and simplex range queries
Cited in (10)
- Algebraic methods proving Sauer's bound for teaching complexity
- Bounding embeddings of VC classes into maximum classes
- The VC-dimension of axis-parallel boxes on the torus
- Labeled compression schemes for extremal classes
- On version space compression
- Recursive teaching dimension, learning complexity, and maximum classes
- Sample Compression Schemes for VC Classes
- Sauer's bound for a notion of teaching complexity
- Recursive teaching dimension, VC-dimension and sample compression
- A note on hardness of computing recursive teaching dimension