Just interpolate: kernel “ridgeless” regression can generalize


DOI: 10.1214/19-AOS1849 · zbMATH Open: 1453.68155 · arXiv: 1808.00387 · OpenAlex: W3104969455 · MaRDI QID: Q2196223 · FDO: Q2196223


Authors: Tengyuan Liang, Alexander Rakhlin


Publication date: 28 August 2020

Published in: The Annals of Statistics

Abstract: In the absence of explicit regularization, Kernel "Ridgeless" Regression with nonlinear kernels has the potential to fit the training data perfectly. It has been observed empirically, however, that such interpolated solutions can still generalize well on test data. We isolate a phenomenon of implicit regularization for minimum-norm interpolated solutions which is due to a combination of high dimensionality of the input data, curvature of the kernel function, and favorable geometric properties of the data such as an eigenvalue decay of the empirical covariance and kernel matrices. In addition to deriving a data-dependent upper bound on the out-of-sample error, we present experimental evidence suggesting that the phenomenon occurs in the MNIST dataset.


Full work available at URL: https://arxiv.org/abs/1808.00387
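
For illustration of the object studied in the abstract, here is a minimal sketch of kernel “ridgeless” regression: the minimum-RKHS-norm interpolant obtained by solving K·alpha = y with a pseudoinverse and no explicit ridge penalty. The Gaussian kernel, the synthetic high-dimensional data, and all parameter choices below are illustrative assumptions, not code or settings from the article.

```python
# Minimum-norm ("ridgeless") kernel interpolation sketch (assumptions: Gaussian kernel,
# synthetic data; not the authors' experimental setup).
import numpy as np

def gaussian_kernel(X, Z, bandwidth=1.0):
    # RBF kernel k(x, z) = exp(-||x - z||^2 / (2 h^2)), computed pairwise.
    sq_dists = np.sum(X**2, axis=1)[:, None] + np.sum(Z**2, axis=1)[None, :] - 2 * X @ Z.T
    return np.exp(-sq_dists / (2 * bandwidth**2))

def fit_ridgeless(X_train, y_train, bandwidth=1.0):
    # No regularization: the pseudoinverse of the kernel matrix gives the
    # minimum-norm solution among all interpolants of the training data.
    K = gaussian_kernel(X_train, X_train, bandwidth)
    return np.linalg.pinv(K) @ y_train

def predict(X_train, alpha, X_test, bandwidth=1.0):
    # f(x) = sum_i alpha_i k(x_i, x)
    return gaussian_kernel(X_test, X_train, bandwidth) @ alpha

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, d = 200, 50                              # high-dimensional inputs (illustrative sizes)
    w = rng.standard_normal(d)
    X = rng.standard_normal((n, d)) / np.sqrt(d)
    y = np.sin(X @ w) + 0.1 * rng.standard_normal(n)
    alpha = fit_ridgeless(X, y)
    print("train MSE:", np.mean((predict(X, alpha, X) - y) ** 2))      # ~0: fits data exactly
    X_new = rng.standard_normal((n, d)) / np.sqrt(d)
    y_new = np.sin(X_new @ w)
    print("test MSE:", np.mean((predict(X, alpha, X_new) - y_new) ** 2))
```

Despite interpolating the training labels (including noise), such minimum-norm solutions can still predict reasonably on fresh data in high dimensions, which is the phenomenon the paper analyzes.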




Cited In (45)


This page was built for publication: Just interpolate: kernel “ridgeless” regression can generalize
