An efficient algorithm for Bayesian nearest neighbours

From MaRDI portal
Publication:2176367

DOI: 10.1007/S11009-018-9670-Z
zbMATH Open: 1437.62235
arXiv: 1705.09407
OpenAlex: W2964099208
MaRDI QID: Q2176367
FDO: Q2176367

Giuseppe Nuti

Publication date: 4 May 2020

Published in: Methodology and Computing in Applied Probability

Abstract: K-Nearest Neighbours (k-NN) is a popular classification and regression algorithm, yet one of its main limitations is the difficulty in choosing the number of neighbours. We present a Bayesian algorithm to compute the posterior probability distribution for k given a target point within a data-set, efficiently and without the use of Markov Chain Monte Carlo (MCMC) methods or simulation, alongside an exact solution for distributions within the exponential family. The central idea is that data points around our target are generated by the same probability distribution, extending outwards over the appropriate, though unknown, number of neighbours. Once the data is projected onto a distance metric of choice, the choice of k becomes a change-point detection problem, for which there is an efficient solution: we recursively compute the probability of the last change-point as we move towards our target, and thus in effect compute the posterior probability distribution over k. Applying this approach to both a classification and a regression UCI data-set, we compare favourably with existing methods and, most importantly, by removing the need for simulation, we are able to compute the posterior probability of k exactly and rapidly. As an example, the computational time for the Ripley data-set is a few milliseconds, compared to a few hours when using an MCMC approach.


Full work available at URL: https://arxiv.org/abs/1705.09407
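
The conjugate, exponential-family case mentioned in the abstract can be illustrated with a simplified sketch. The code below (an illustration under stated assumptions, not the paper's full algorithm) sorts binary labels by distance from the target and scores each candidate k by the Beta-Bernoulli marginal likelihood of the k nearest labels, under a uniform prior over k; the paper's exact method additionally carries the recursive last-change-point probabilities, which this sketch omits. All function names and priors here are assumptions of the sketch.

```python
import numpy as np
from math import lgamma

def log_beta_bernoulli_evidence(successes, failures, a=1.0, b=1.0):
    """Log marginal likelihood of binary outcomes under a Beta(a, b) prior,
    using the conjugate (exponential-family) closed form."""
    return (lgamma(a + b) - lgamma(a) - lgamma(b)
            + lgamma(a + successes) + lgamma(b + failures)
            - lgamma(a + b + successes + failures))

def knn_posterior(labels_sorted, a=1.0, b=1.0):
    """Simplified posterior over k for binary labels sorted by distance
    from the target point.

    Assumes the k nearest labels are i.i.d. Bernoulli with a Beta(a, b)
    prior on the success probability, and a uniform prior over k.
    """
    n = len(labels_sorted)
    log_post = np.empty(n)
    s = 0  # running count of positive labels among the k nearest
    for k in range(1, n + 1):
        s += labels_sorted[k - 1]
        log_post[k - 1] = log_beta_bernoulli_evidence(s, k - s, a, b)
    log_post -= log_post.max()  # stabilise before exponentiating
    post = np.exp(log_post)
    return post / post.sum()
```

Because every quantity is a running count updated in O(1) per step, the whole posterior over k is computed in a single linear pass, which is the source of the millisecond-scale timings the abstract reports for the exact approach.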





Cited In (3)

