Consistency of Lipschitz Learning with Infinite Unlabeled Data and Finite Labeled Data
Publication: 5025772
DOI: 10.1137/18M1199241
zbMath: 1499.35598
arXiv: 1710.10364
OpenAlex: W2992402078
Wikidata: Q126638071 (Scholia: Q126638071)
MaRDI QID: Q5025772
Publication date: 3 February 2022
Published in: SIAM Journal on Mathematics of Data Science
Full work available at URL: https://arxiv.org/abs/1710.10364
Keywords: maximum principle; consistency; partial differential equations; viscosity solutions; graph-based semi-supervised learning; continuum limits; Lipschitz learning
MSC classification: Nonlinear elliptic equations (35J60); Finite difference methods for boundary value problems involving PDEs (65N06); Viscosity solutions to PDEs (35D40); PDEs on graphs and networks (ramified or polygonal spaces) (35R02)
Related Items
- Lipschitz Regularity of Graph Laplacians on Random Data Clouds
- Convergence of Physics-Informed Neural Networks Applied to Linear Second-Order Elliptic Interface Problems
- A continuum limit for the PageRank algorithm
- Multiscale Elliptic PDE Upscaling and Function Approximation via Subsampled Data
- Analysis and algorithms for \(\ell_p\)-based semi-supervised learning on graphs
- Improved spectral convergence rates for graph Laplacians on \(\varepsilon\)-graphs and \(k\)-NN graphs
- Boundary estimation from point clouds: algorithms, guarantees and applications
- Data-driven method to learn the most probable transition pathway and stochastic differential equation
- Solving an inverse source problem by deep neural network method with convergence and error analysis
- Rates of convergence for Laplacian semi-supervised learning with low labeling rates
- Poisson Reweighted Laplacian Uncertainty Sampling for Graph-Based Active Learning
- The infinity Laplacian eigenvalue problem: reformulation and a numerical scheme
- Continuum limit of Lipschitz learning on graphs
- Function approximation via the subsampled Poincaré inequality
- Gromov-Hausdorff limit of Wasserstein spaces on point clouds
- On the Convergence of Physics Informed Neural Networks for Linear Second-Order Elliptic and Parabolic Type PDEs
- Properly-weighted graph Laplacian for semi-supervised learning
- Iterative surrogate model optimization (ISMO): an active learning algorithm for PDE constrained optimization with deep neural networks
Cites Work
- Continuum limit of total variation on point clouds
- Uniqueness of Lipschitz extensions: Minimizing the sup norm of the gradient
- Tug-of-war with noise: a game-theoretic view of the \(p\)-Laplacian
- Weighted nonlocal Laplacian on interpolation from sparse data
- Finite difference methods for the infinity Laplace and \(p\)-Laplace equations
- Game theoretical methods in PDEs
- Nonlinear elliptic partial differential equations and \(p\)-harmonic functions on graphs
- On local \(U\)-statistic processes and the estimation of densities of functions of several sample variables
- A PDE-based Approach to Nondominated Sorting
- Tug-of-war and the infinity Laplacian
- Vector-valued optimal Lipschitz extensions
- User’s guide to viscosity solutions of second order partial differential equations
- Estimating Densities of Functions of Observations
- Notes on the Stationary \(p\)-Laplace Equation
- The game theoretic \(p\)-Laplacian and semi-supervised learning with few labels
- A convergent difference scheme for the infinity Laplacian: construction of absolutely minimizing Lipschitz extensions
- A tour of the theory of absolutely minimizing functions
- Analysis of \(p\)-Laplacian Regularization in Semisupervised Learning
- Advanced Lectures on Machine Learning
- Statistical Analysis and Modelling of Spatial Point Patterns
- Limits of Solutions of \(p\)-Laplace Equations as \(p\) Goes to Infinity and Related Variational Problems
- Learning Theory
- The Euler equation and absolute minimizers of \(L^\infty\) functionals