A metric entropy bound is not sufficient for learnability
DOI: 10.1109/18.335898
zbMATH Open: 0837.68091
OpenAlex: W2045888327
MaRDI QID: Q4838641
FDO: Q4838641
Authors: Sanjeev R. Kulkarni, Ofer Zeitouni, Richard M. Dudley, T. J. Richardson
Publication date: 12 July 1995
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://semanticscholar.org/paper/285fa8b9f8243fecae27f57ae27e9779941df3c2
Recommendations
- Entropy bounds on Bayesian learning
- Metric entropy limits on recurrent neural network learning of linear dynamical systems
- Learning entropy: multiscale measure for incremental learning
- Generalization bounds for metric and similarity learning
- The entropy in learning theory. Error estimates
- Nearest-neighbor entropy estimators with weak metrics
- Learning theory of minimum error entropy under weak moment conditions
- Entropy samplers and strong generic lower bounds for space bounded learning
- Learning theory approach to minimum error entropy criterion
Cited In (8)
- Realizable learning is all you need
- Title not available
- Scale-sensitive dimensions and skeleton estimates for classification
- Characterizing rational versus exponential learning curves
- Some contributions to fixed-distribution learning theory
- Closure properties of uniform convergence of empirical means and PAC learnability under a family of probability measures.
- Entropy samplers and strong generic lower bounds for space bounded learning
- A recent result about random metric spaces explains why all of us have similar learning potential