A metric entropy bound is not sufficient for learnability
From MaRDI portal
Publication:4838641
Recommendations
- Entropy bounds on Bayesian learning
- Metric entropy limits on recurrent neural network learning of linear dynamical systems
- Learning entropy: multiscale measure for incremental learning
- Generalization bounds for metric and similarity learning
- The entropy in learning theory. Error estimates
- Nearest-neighbor entropy estimators with weak metrics
- Learning theory of minimum error entropy under weak moment conditions
- Entropy samplers and strong generic lower bounds for space bounded learning
- Learning theory approach to minimum error entropy criterion
Cited in (8)
- Realizable learning is all you need
- Scientific article; zbMATH DE number 67635 (no title available)
- Scale-sensitive dimensions and skeleton estimates for classification
- Characterizing rational versus exponential learning curves
- Some contributions to fixed-distribution learning theory
- Closure properties of uniform convergence of empirical means and PAC learnability under a family of probability measures.
- Entropy samplers and strong generic lower bounds for space bounded learning
- A recent result about random metric spaces explains why all of us have similar learning potential
This page was built for publication: A metric entropy bound is not sufficient for learnability