Properly-weighted graph Laplacian for semi-supervised learning

From MaRDI portal
Publication:2019913

DOI: 10.1007/s00245-019-09637-3
zbMATH Open: 1465.35152
arXiv: 1810.04351
OpenAlex: W2993986630
Wikidata: Q126575701
Scholia: Q126575701
MaRDI QID: Q2019913
FDO: Q2019913


Authors: Jeff Calder, Dejan Slepčev


Publication date: 22 April 2021

Published in: Applied Mathematics and Optimization

Abstract: The performance of traditional graph Laplacian methods for semi-supervised learning degrades substantially as the ratio of labeled to unlabeled data decreases, due to a degeneracy in the graph Laplacian. Several approaches have recently been proposed to address this; however, we show that some of them remain ill-posed in the large-data limit. In this paper, we show how to set the weights in Laplacian regularization so that the estimator remains well-posed and stable in the large-sample limit. We prove that our semi-supervised learning algorithm converges, in the infinite-sample-size limit, to the smooth solution of a continuum variational problem that attains the labeled values continuously. Our method is fast and easy to implement.
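To illustrate the setting the abstract describes, below is a minimal sketch of the *baseline* graph-Laplacian semi-supervised learning problem that the paper improves upon: given a weighted graph on sample points and a few labeled vertices, the unlabeled values are obtained by solving a discrete Dirichlet problem. This is not the authors' properly-weighted method (whose contribution is a specific reweighting of edges near labeled points); the synthetic data, kernel bandwidth, and label placement are illustrative assumptions.

```python
import numpy as np

# Sketch under assumptions (not the authors' code): plain Laplacian
# regularization on synthetic 1-D data with a Gaussian-kernel graph.
rng = np.random.default_rng(0)
n = 200
x = np.sort(rng.uniform(0.0, 1.0, n))          # sample locations

# Gaussian edge weights w_ij = exp(-|x_i - x_j|^2 / eps^2)
eps = 0.05
W = np.exp(-((x[:, None] - x[None, :]) ** 2) / eps**2)
np.fill_diagonal(W, 0.0)

D = np.diag(W.sum(axis=1))
L = D - W                                       # unnormalized graph Laplacian

# Two labeled points: u = 0 near x = 0, u = 1 near x = 1
labeled = np.array([0, n - 1])
y = np.array([0.0, 1.0])
unlabeled = np.setdiff1d(np.arange(n), labeled)

# Minimize u^T L u subject to the labels, i.e. solve L_uu u_u = -L_ul y
L_uu = L[np.ix_(unlabeled, unlabeled)]
L_ul = L[np.ix_(unlabeled, labeled)]
u = np.empty(n)
u[labeled] = y
u[unlabeled] = np.linalg.solve(L_uu, -L_ul @ y)

# By the discrete maximum principle, all values lie between the labels.
print(u.min(), u.max())
```

The degeneracy the paper addresses appears when the number of labels stays fixed while n grows: the minimizer of the discrete Dirichlet energy then develops spikes at the labeled points and becomes nearly constant elsewhere, which motivates the properly-weighted Laplacian studied in the paper.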


Full work available at URL: https://arxiv.org/abs/1810.04351










Cited In (14)






