When do neural networks outperform kernel methods?*
From MaRDI portal
Publication: 5020050
DOI: 10.1088/1742-5468/AC3A81 · OpenAlex: W4206434796 · MaRDI QID: Q5020050
No author found.
Publication date: 3 January 2022
Published in: Journal of Statistical Mechanics: Theory and Experiment
Full work available at URL: https://arxiv.org/abs/2006.13409
Related Items (9)
Deep learning: a statistical viewpoint
Fit without fear: remarkable mathematical phenomena of deep learning through the prism of interpolation
Generalization error of random feature and kernel methods: hypercontractivity and kernel matrix concentration
Learning curves of generic features maps for realistic datasets with a teacher-student model*
Generalization error rates in kernel regression: the crossover from the noiseless to noisy regime*
Particle dual averaging: optimization of mean field neural network with global convergence rate analysis*
Relative stability toward diffeomorphisms indicates performance in deep nets*
Algorithmic Regularization in Model-Free Overparametrized Asymmetric Matrix Factorization
The interpolation phase transition in neural networks: memorization and generalization under lazy training