Locality defeats the curse of dimensionality in convolutional teacher–student scenarios*
From MaRDI portal
Publication:5055428
Recommendations
- Asymptotic learning curves of kernel methods: empirical data versus teacher–student paradigm
- Learning curves of generic features maps for realistic datasets with a teacher-student model*
- Just interpolate: kernel "ridgeless" regression can generalize
- When do neural networks outperform kernel methods?*
- VC dimensions of group convolutional neural networks
Cites work
- scientific article; zbMATH DE number 1273988
- Asymptotic behavior of the eigenvalues of certain integral equations. II
- Asymptotic learning curves of kernel methods: empirical data versus teacher–student paradigm
- Bayesian learning for neural networks
- Breaking the curse of dimensionality with convex neural networks
- Distance-based classification with Lipschitz functions
- ImageNet
- Theoretical issues in deep networks
- Wide neural networks of any depth evolve as linear models under gradient descent*
Cited in (1)
This page was built for publication: Locality defeats the curse of dimensionality in convolutional teacher–student scenarios*