Intrinsic dimension adaptive partitioning for kernel methods


DOI: 10.1137/21M1435690
zbMATH Open: 1491.68154
arXiv: 2107.07750
OpenAlex: W3185132568
MaRDI QID: Q5089718
FDO: Q5089718


Authors: Thomas Hamm, Ingo Steinwart


Publication date: 15 July 2022

Published in: SIAM Journal on Mathematics of Data Science

Abstract: We prove minimax optimal learning rates for kernel ridge regression and, respectively, support vector machines based on a data-dependent partition of the input space, where the dependence on the dimension of the input space is replaced by the fractal dimension of the support of the data-generating distribution. We further show that these optimal rates can be achieved by a training-validation procedure without any prior knowledge of this intrinsic dimension of the data. Finally, we conduct extensive experiments demonstrating that the considered learning methods generalize from a dataset that is non-trivially embedded in a much higher-dimensional space just as well as from the original dataset.


Full work available at URL: https://arxiv.org/abs/2107.07750
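
The abstract's recipe can be illustrated with a small sketch: build a data-dependent partition of the input space, fit a local kernel ridge regressor on each cell, and choose the partition size and kernel hyperparameters on a hold-out validation set. The code below is an illustrative assumption of such a pipeline, not the authors' implementation; the k-means partition, the hyperparameter grids, and all function names are placeholders, built on scikit-learn's KMeans and KernelRidge.

# Sketch (assumed, not from the paper): partitioned kernel ridge regression
# with training-validation selection of the partition size and kernel parameters.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import train_test_split

def fit_partitioned_krr(X, y, n_cells, alpha, gamma, random_state=0):
    """Partition X into n_cells regions via k-means (a stand-in for the
    data-dependent partition) and fit one RBF kernel ridge regressor per region."""
    km = KMeans(n_clusters=n_cells, n_init=10, random_state=random_state).fit(X)
    models = {}
    for c in range(n_cells):
        mask = km.labels_ == c
        m = KernelRidge(alpha=alpha, kernel="rbf", gamma=gamma)
        m.fit(X[mask], y[mask])
        models[c] = m
    return km, models

def predict_partitioned_krr(km, models, X):
    """Route each test point to its cell and predict with the local model."""
    cells = km.predict(X)
    y_hat = np.empty(len(X))
    for c, m in models.items():
        mask = cells == c
        if mask.any():
            y_hat[mask] = m.predict(X[mask])
    return y_hat

def train_validate(X, y, cell_grid=(2, 4, 8), alpha_grid=(1e-3, 1e-1),
                   gamma_grid=(0.1, 1.0, 10.0)):
    """Pick (n_cells, alpha, gamma) on a hold-out validation split, with no
    prior knowledge of the intrinsic dimension; the grids are placeholders."""
    X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)
    best, best_err = None, np.inf
    for n_cells in cell_grid:
        for alpha in alpha_grid:
            for gamma in gamma_grid:
                km, models = fit_partitioned_krr(X_tr, y_tr, n_cells, alpha, gamma)
                err = np.mean((predict_partitioned_krr(km, models, X_val) - y_val) ** 2)
                if err < best_err:
                    best, best_err = (n_cells, alpha, gamma), err
    return best, best_err

# Toy usage echoing the embedding experiments: a 1-dimensional manifold
# non-trivially embedded in R^10.
rng = np.random.default_rng(0)
t = rng.uniform(0, 1, size=(500, 1))
X = np.hstack([np.cos(2 * np.pi * t), np.sin(2 * np.pi * t), np.zeros((500, 8))])
y = np.sin(4 * np.pi * t).ravel() + 0.1 * rng.normal(size=500)
print(train_validate(X, y))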




