On the Inconsistency of Kernel Ridgeless Regression in Fixed Dimensions
Publication: 6070298
DOI: 10.1137/22m1499819
arXiv: 2205.13525
MaRDI QID: Q6070298
Mikhail Belkin, Daniel Beaglehole, Parthe Pandit
Publication date: 20 November 2023
Published in: SIAM Journal on Mathematics of Data Science
Full work available at URL: https://arxiv.org/abs/2205.13525
Keywords: interpolation, consistency, nonparametric regression, kernel machines, benign overfitting, ridgeless regression
Cites Work
- The Hilbert kernel regression estimate
- A distribution-free theory of nonparametric regression
- Surprises in high-dimensional ridgeless least squares interpolation
- Generalization error of random feature and kernel methods: hypercontractivity and kernel matrix concentration
- Just interpolate: kernel "ridgeless" regression can generalize
- Two Models of Double Descent for Weak Features
- The Generalization Error of Random Features Regression: Precise Asymptotics and the Double Descent Curve
- Benign overfitting in linear regression
- Reconciling modern machine-learning practice and the classical bias–variance trade-off