Universally consistent vertex classification for latent positions graphs
From MaRDI portal
Publication:366983
Abstract: In this work we show that, using the eigen-decomposition of the adjacency matrix, we can consistently estimate feature maps for latent position graphs with positive definite link function κ, provided that the latent positions are i.i.d. from some distribution F. We then consider the exploitation task of vertex classification, where the link function κ belongs to the class of universal kernels, class labels are observed for a number of vertices tending to infinity, and the remaining vertices are to be classified. We show that minimization of the empirical φ-risk, for some convex surrogate φ of the 0-1 loss, over a class of linear classifiers of increasing complexity yields a universally consistent classifier, that is, a classification rule whose error converges to the Bayes optimum for any distribution F.
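The pipeline the abstract describes can be illustrated on a small simulated example: embed the vertices via the eigen-decomposition of the adjacency matrix, then classify held-out vertices with a simple linear rule in the embedded space. The sketch below is illustrative only, not the paper's construction: it uses a two-block stochastic blockmodel (a special case of a latent position graph with a positive definite link) and a nearest-class-mean rule in place of the paper's empirical φ-risk minimization; all names and parameter values are assumptions.

```python
import numpy as np

def adjacency_spectral_embedding(A, d):
    """Embed vertices into R^d using the top-d eigenpairs of A by magnitude.
    Rows estimate the latent positions up to an orthogonal transformation."""
    vals, vecs = np.linalg.eigh(A)
    idx = np.argsort(np.abs(vals))[::-1][:d]
    return vecs[:, idx] * np.sqrt(np.abs(vals[idx]))

rng = np.random.default_rng(0)
n = 600
# Two-block stochastic blockmodel: B is positive definite (eigenvalues 0.7, 0.3),
# so it corresponds to a latent position graph with an inner-product link.
labels = rng.integers(0, 2, size=n)
B = np.array([[0.5, 0.2], [0.2, 0.5]])
P = B[labels][:, labels]           # edge-probability matrix P[i, j] = B[z_i, z_j]
A = (rng.uniform(size=(n, n)) < P).astype(float)
A = np.triu(A, 1)
A = A + A.T                        # symmetric, hollow adjacency matrix

Xhat = adjacency_spectral_embedding(A, d=2)

# Vertex classification: labels are observed on the first half of the vertices;
# a nearest-class-mean rule (linear in the embedded space) labels the rest.
train, test = np.arange(n // 2), np.arange(n // 2, n)
means = np.stack([Xhat[train][labels[train] == k].mean(axis=0) for k in (0, 1)])
dists = np.linalg.norm(Xhat[test][:, None, :] - means[None, :, :], axis=2)
pred = dists.argmin(axis=1)
accuracy = (pred == labels[test]).mean()
```

With well-separated blocks at this graph size, the embedded positions concentrate around two class means, so the linear rule recovers almost all held-out labels; the paper's result is the far stronger statement that, with universal kernels and growing classifier complexity, such empirical-risk minimizers are consistent for any latent position distribution F.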
Cites work
- scientific article; zbMATH DE number 893887
- A Consistent Adjacency Spectral Embedding for Stochastic Blockmodel Graphs
- Accurate error bounds for the eigenvalues of the kernel matrix
- Consistency of spectral clustering
- Consistent adjacency-spectral partitioning for the stochastic block model when the model parameters are unknown
- Consistent nonparametric regression. Discussion
- Convexity, Classification, and Risk Bounds
- Freedman's inequality for matrix martingales
- Functional Classification in Hilbert Spaces
- Graph Laplacians and their convergence on random neighborhood graphs
- Graph limits and exchangeable random graphs
- Latent Space Approaches to Social Network Analysis
- Learning Eigenfunctions Links Spectral Embedding and Kernel PCA
- Learning Theory
- Learning Theory
- Learning Theory
- Matrix estimation by universal singular value thresholding
- Mercer's theorem on general domains: on the interaction between measures, kernels, and RKHSs
- Mixed membership stochastic blockmodels
- On learning with integral operators
- On the Bayes-risk consistency of regularized boosting methods.
- On the Eigenspectrum of the Gram Matrix and the Generalization Error of Kernel-PCA
- On the influence of the kernel on the consistency of support vector machines
- On the mathematical foundations of learning
- Random Dot Product Graph Models for Social Networks
- Random matrix approximation of spectra of integral operators
- Spectral clustering and the high-dimensional stochastic blockmodel
- Statistical behavior and consistency of classification methods based on convex risk minimization.
- Statistical performance of support vector machines
- Support vector machines are universally consistent
- The Rotation of Eigenvectors by a Perturbation. III
- The phase transition in inhomogeneous random graphs
- Universal kernels
- Universality, Characteristic Kernels and RKHS Embedding of Measures
- Variation of discrete spectra
Cited in (29)
- Entropic Optimal Transport on Random Graphs
- scientific article; zbMATH DE number 7415085
- Shuffled graph classification: theory and connectome applications
- A limit theorem for scaled eigenvectors of random dot product graphs
- Maximum a posteriori inference of random dot product graphs via conic programming
- On the estimation of latent distances using graph distances
- Adaptive estimation of nonparametric geometric graphs
- Bayesian vertex nomination using content and context
- Vertex nomination
- Exponential-family models of random graphs: inference in finite, super and infinite population scenarios
- Markov random geometric graph, MRGG: a growth model for temporal dynamic networks
- Hyperlink regression via Bregman divergence
- Entrywise limit theorems for eigenvectors of signal-plus-noise matrix models with weak signals
- Testing for Equivalence of Network Distribution Using Subgraph Counts
- Statistical inference on random dot product graphs: a survey
- Stratified Stochastic Variational Inference for High-Dimensional Network Factor Model
- A probabilistic view of latent space graphs and phase transitions
- Perfect clustering for stochastic blockmodel graphs via adjacency spectral embedding
- High-dimensional Gaussian graphical models on network-linked data
- Efficient Estimation for Random Dot Product Graphs via a One-Step Procedure
- scientific article; zbMATH DE number 7626706
- Community detection and percolation of information in a geometric setting
- Bayesian estimation of the latent dimension and communities in stochastic blockmodels
- Consistent nonparametric estimation for heavy-tailed sparse graphs
- Empirical Bayes estimation for the stochastic blockmodel
- Motif estimation via subgraph sampling: the fourth-moment phenomenon
- Universal latent space model fitting for large networks with edge covariates
- scientific article; zbMATH DE number 7626709
- On consistent vertex nomination schemes