Is the k-NN classifier in high dimensions affected by the curse of dimensionality?

From MaRDI portal
Publication:2629451

DOI: 10.1016/J.CAMWA.2012.09.011
zbMATH Open: 1362.68248
arXiv: 1110.4347
OpenAlex: W2057074619
MaRDI QID: Q2629451
FDO: Q2629451


Author: Vladimir G. Pestov


Publication date: 6 July 2016

Published in: Computers & Mathematics with Applications

Abstract: There is an increasing body of evidence suggesting that exact nearest neighbour search in high-dimensional spaces is affected by the curse of dimensionality at a fundamental level. Does this necessarily mean that the same is true for k-nearest-neighbour-based learning algorithms such as the k-NN classifier? We analyse this question at a number of levels and show that the answer is different at each of them. As our first main observation, we show the consistency of a k approximate nearest neighbour classifier. However, the performance of the classifier in very high dimensions is provably unstable. As our second main observation, we point out that the existing model for statistical learning is oblivious to the dimension of the domain, so every learning problem admits a universally consistent deterministic reduction to the one-dimensional case by means of a Borel isomorphism.
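For readers unfamiliar with the classifier under discussion, the following is a minimal sketch of a standard k-NN classifier (majority vote among the k nearest training points under Euclidean distance); it is an illustration of the general technique, not the specific approximate-nearest-neighbour variant analysed in the paper, and the data and function names are invented for the example.

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training points."""
    # Euclidean distance from x to every training point
    d = np.linalg.norm(X_train - x, axis=1)
    # Indices of the k smallest distances
    nearest = np.argsort(d)[:k]
    # Majority vote over the neighbours' labels
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Toy two-class data in two dimensions
X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
print(knn_predict(X, y, np.array([0.05, 0.1])))  # prints 0 (near class 0)
```

In high dimensions the distances d concentrate (the gap between the nearest and farthest training point shrinks relative to their magnitude), which is the phenomenon behind the curse of dimensionality the paper examines.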


Full work available at URL: https://arxiv.org/abs/1110.4347





Cited In (8)





