Pages that link to "Item:Q5020050"
From MaRDI portal
The following pages link to When do neural networks outperform kernel methods?* (Q5020050):
Displaying 10 items.
- The interpolation phase transition in neural networks: memorization and generalization under lazy training (Q2105197)
- Generalization error of random feature and kernel methods: hypercontractivity and kernel matrix concentration (Q2134105)
- Learning curves of generic features maps for realistic datasets with a teacher-student model* (Q5055409)
- Generalization error rates in kernel regression: the crossover from the noiseless to noisy regime* (Q5055412)
- Particle dual averaging: optimization of mean field neural network with global convergence rate analysis* (Q5055425)
- Relative stability toward diffeomorphisms indicates performance in deep nets* (Q5055429)
- Deep learning: a statistical viewpoint (Q5887827)
- Fit without fear: remarkable mathematical phenomena of deep learning through the prism of interpolation (Q5887828)
- Algorithmic Regularization in Model-Free Overparametrized Asymmetric Matrix Factorization (Q6136234)
- Learning sparse features can lead to overfitting in neural networks (Q6611436)