Benign Overfitting and Noisy Features
Publication: 6185582
DOI: 10.1080/01621459.2022.2093206 · arXiv: 2008.02901 · MaRDI QID: Q6185582
Zhu Li, Dino Sejdinovic, Weijie J. Su
Publication date: 8 January 2024
Published in: Journal of the American Statistical Association
Full work available at URL: https://arxiv.org/abs/2008.02901
Cites Work
- Surprises in high-dimensional ridgeless least squares interpolation
- Just interpolate: kernel "ridgeless" regression can generalize
- Mean field analysis of neural networks: a central limit theorem
- Convergence types and rates in generic Karhunen-Loève expansions with applications to sample path properties
- Optimal rates for the regularized least-squares algorithm
- Local Rademacher complexities
- Support Vector Machines
- A mean field view of the landscape of two-layer neural networks
- A random matrix analysis of random Fourier features: beyond the Gaussian kernel, a precise phase transition, and the corresponding double descent
- Two Models of Double Descent for Weak Features
- The Generalization Error of Random Features Regression: Precise Asymptotics and the Double Descent Curve
- Benign overfitting in linear regression
- A universal sampling method for reconstructing signals with simple Fourier transforms
- Reconciling modern machine-learning practice and the classical bias–variance trade-off
- On the Equivalence between Kernel Quadrature Rules and Random Feature Expansions
- The elements of statistical learning. Data mining, inference, and prediction
- Neural tangent kernel: convergence and generalization in neural networks (invited paper)