Locality defeats the curse of dimensionality in convolutional teacher–student scenarios*
From MaRDI portal
Publication: Q5055428
DOI: 10.1088/1742-5468/AC98AB
OpenAlex: W3213007444
MaRDI QID: Q5055428
FDO: Q5055428
Authors: Alessandro Favero, Francesco Cagnetta, Matthieu Wyart
Publication date: 13 December 2022
Published in: Journal of Statistical Mechanics: Theory and Experiment
Full work available at URL: https://arxiv.org/abs/2106.08619
Recommendations
- Asymptotic learning curves of kernel methods: empirical data versus teacher–student paradigm
- Learning curves of generic features maps for realistic datasets with a teacher-student model*
- Just interpolate: kernel "ridgeless" regression can generalize
- When do neural networks outperform kernel methods?*
- VC dimensions of group convolutional neural networks
Cites Work
- ImageNet
- Title not available
- Bayesian learning for neural networks
- Asymptotic behavior of the eigenvalues of certain integral equations. II
- Distance-based classification with Lipschitz functions
- Breaking the curse of dimensionality with convex neural networks
- Wide neural networks of any depth evolve as linear models under gradient descent*
- Theoretical issues in deep networks
- Asymptotic learning curves of kernel methods: empirical data versus teacher–student paradigm