Large data and zero noise limits of graph-based semi-supervised learning algorithms
DOI: 10.1016/J.ACHA.2019.03.005 · zbMATH Open: 1442.62768 · arXiv: 1805.09450 · OpenAlex: W2963110350 · Wikidata: Q128098408 · Scholia: Q128098408 · MaRDI QID: Q778036 · FDO: Q778036
Authors: Matthew M. Dunlop, Dejan Slepčev, Matthew Thorpe, Andrew M. Stuart
Publication date: 30 June 2020
Published in: Applied and Computational Harmonic Analysis
Full work available at URL: https://arxiv.org/abs/1805.09450
Recommendations
- On the consistency of graph-based Bayesian semi-supervised learning and the scalability of sampling algorithms
- Properly-weighted graph Laplacian for semi-supervised learning
- Continuum limit of Lipschitz learning on graphs
- Learning Theory
- Analysis of \(p\)-Laplacian regularization in semisupervised learning
Keywords: Bayesian inference; kriging; semi-supervised learning; asymptotic consistency; higher-order fractional Laplacian
MSC classification:
- Bayesian inference (62F15)
- Asymptotic properties of nonparametric inference (62G20)
- Statistical aspects of big data and data science (62R07)
- Learning and adaptive systems in artificial intelligence (68T05)
- Quadratic programming (90C20)
- Bayesian problems; characterization of Bayes procedures (62C10)
- Methods involving semicontinuity and convergence; relaxation (49J45)
- Programming in abstract spaces (90C48)
Cites Work
- Title not available
- Title not available
- Consistency of spectral clustering
- Elliptic problems in nonsmooth domains
- An introduction to \(\Gamma\)-convergence
- Riemannian manifolds with maximal eigenfunction growth
- The spectral function of an elliptic operator
- Title not available
- A survey on level set methods for inverse problems and optimal design
- \(\Gamma\)-convergence of graph Ginzburg-Landau functionals
- Continuum limit of total variation on point clouds
- On the Rate of Convergence of Empirical Measures in ∞-transportation Distance
- Diffuse Interface Models on Graphs for Classification of High Dimensional Data
- Uniform bounds for eigenfunctions of the Laplacian on manifolds with boundary
- A first course in Sobolev spaces
- MCMC methods for functions: modifying old algorithms to make them faster
- Title not available
- Interpolation between weighted L\(^p\)-spaces
- Convergence and rates for fixed-interval multiple-track smoothing using \(k\)-means type optimization
- Convergence of the \(k\)-means minimization problem using \(\Gamma\)-convergence
- Title not available
- Weighted nonlocal Laplacian on interpolation from sparse data
Cited In (26)
- Consistency of Lipschitz learning with infinite unlabeled data and finite labeled data
- Lipschitz regularity of graph Laplacians on random data clouds
- Properly-weighted graph Laplacian for semi-supervised learning
- Title not available
- Consistency of fractional graph-Laplacian regularization in semisupervised learning with finite labels
- Poisson Reweighted Laplacian Uncertainty Sampling for Graph-Based Active Learning
- Analysis of \(p\)-Laplacian regularization in semisupervised learning
- Mathematical Foundations of Graph-Based Bayesian Semi-Supervised Learning
- A maximum principle argument for the uniform convergence of graph Laplacian regressors
- Plugin estimation of smooth optimal transport maps
- From graph cuts to isoperimetric inequalities: convergence rates of Cheeger cuts on data clouds
- Variational limits of \(k\)-NN graph-based functionals on data clouds
- Title not available
- Multilevel approximation of Gaussian random fields: covariance compression, estimation, and spatial prediction
- Posterior consistency of semi-supervised regression on graphs
- Spectral analysis of weighted Laplacians arising in data clustering
- Spectral gaps and error estimates for infinite-dimensional Metropolis-Hastings with non-Gaussian priors
- Rates of convergence for Laplacian semi-supervised learning with low labeling rates
- Harmonic analysis on graphs via Bratteli diagrams and path-space measures
- Continuum limit of Lipschitz learning on graphs
- A continuum limit for the PageRank algorithm
- On the consistency of graph-based Bayesian semi-supervised learning and the scalability of sampling algorithms
- Large data limit for a phase transition model with the \(p\)-Laplacian on point clouds
- Partial differential equations and variational methods for geometric processing of images
- The game theoretic \(p\)-Laplacian and semi-supervised learning with few labels
- Improved spectral convergence rates for graph Laplacians on \(\varepsilon \)-graphs and \(k\)-NN graphs