Overparameterization and Generalization Error: Weighted Trigonometric Interpolation
Publication: 5088865
DOI: 10.1137/21M1390955 · zbMath: 1493.42004 · arXiv: 2006.08495 · OpenAlex: W3129183175 · MaRDI QID: Q5088865
Holger Rauhut, Hung-Hsu Chou, Rachel Ward, Yuege Xie
Publication date: 14 July 2022
Published in: SIAM Journal on Mathematics of Data Science
Full work available at URL: https://arxiv.org/abs/2006.08495
Mathematics Subject Classification:
- Trigonometric interpolation (42A15)
- Numerical methods for trigonometric approximation and interpolation (65T40)
Cites Work
- A mathematical introduction to compressive sensing
- Linearized two-layers neural networks in high dimension
- Surprises in high-dimensional ridgeless least squares interpolation
- Just interpolate: kernel "ridgeless" regression can generalize
- Block-circulant matrices with circulant blocks, Weil sums, and mutually unbiased bases. II. The prime power case
- Deep double descent: where bigger models and more data hurt
- Two Models of Double Descent for Weak Features
- The Generalization Error of Random Features Regression: Precise Asymptotics and the Double Descent Curve
- Benign overfitting in linear regression
- Reconciling modern machine-learning practice and the classical bias–variance trade-off
- Scaling description of generalization with number of parameters in deep learning
This page was built for publication: Overparameterization and Generalization Error: Weighted Trigonometric Interpolation