Universally consistent vertex classification for latent positions graphs
From MaRDI portal
Publication:366983
Abstract: In this work we show that, using the eigen-decomposition of the adjacency matrix, we can consistently estimate feature maps for latent position graphs with a positive definite link function κ, provided that the latent positions are i.i.d. from some distribution F. We then consider the exploitation task of vertex classification, where the link function κ belongs to the class of universal kernels, class labels are observed for a number of vertices tending to infinity, and the remaining vertices are to be classified. We show that minimization of the empirical φ-risk, for some convex surrogate φ of the 0-1 loss, over a class of linear classifiers with increasing complexities yields a universally consistent classifier, that is, a classification rule with error converging to the Bayes optimum for any distribution F.
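The embedding step described in the abstract can be illustrated with a small simulation. The sketch below (NumPy only; the latent distribution F, the dimension, and the graph size are hypothetical choices, not taken from the paper) draws i.i.d. latent positions, generates a graph whose link function is the inner product, and recovers estimated positions from the scaled top eigenvectors of the adjacency matrix:

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample i.i.d. latent positions X_i ~ F. Here F is uniform on [0.2, 0.8]
# in one dimension -- an illustrative choice, not one from the paper.
n, d = 300, 1
X = rng.uniform(0.2, 0.8, size=(n, d))

# Random dot product graph: edge probabilities are inner products
# kappa(X_i, X_j) = <X_i, X_j>, a positive definite link function.
P = X @ X.T
np.fill_diagonal(P, 0)
A = (rng.uniform(size=(n, n)) < P).astype(float)
A = np.triu(A, 1)
A = A + A.T  # symmetric adjacency matrix, no self-loops

# Adjacency spectral embedding: eigenvectors for the d largest-magnitude
# eigenvalues, scaled by the square roots of those eigenvalues, estimate
# the latent positions up to an orthogonal transformation.
vals, vecs = np.linalg.eigh(A)
idx = np.argsort(np.abs(vals))[::-1][:d]
Xhat = vecs[:, idx] * np.sqrt(np.abs(vals[idx]))

# In this 1-d example the orthogonal ambiguity is just a sign flip.
if np.mean(Xhat[:, 0]) < 0:
    Xhat = -Xhat
```

Downstream, `Xhat` plays the role of the estimated feature map on which a (kernelized) linear classifier is trained; the paper's consistency result concerns that classification step as the number of labeled vertices grows.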
Cites Work
- scientific article; zbMATH DE number 893887
- A Consistent Adjacency Spectral Embedding for Stochastic Blockmodel Graphs
- Accurate error bounds for the eigenvalues of the kernel matrix
- Consistency of spectral clustering
- Consistent adjacency-spectral partitioning for the stochastic block model when the model parameters are unknown
- Consistent nonparametric regression. Discussion
- Convexity, Classification, and Risk Bounds
- Freedman's inequality for matrix martingales
- Functional Classification in Hilbert Spaces
- Graph Laplacians and their convergence on random neighborhood graphs
- Graph limits and exchangeable random graphs
- Latent Space Approaches to Social Network Analysis
- Learning Eigenfunctions Links Spectral Embedding and Kernel PCA
- Learning Theory
- Learning Theory
- Learning Theory
- Matrix estimation by universal singular value thresholding
- Mercer's theorem on general domains: on the interaction between measures, kernels, and RKHSs
- Mixed membership stochastic blockmodels
- On learning with integral operators
- On the Bayes-risk consistency of regularized boosting methods.
- On the Eigenspectrum of the Gram Matrix and the Generalization Error of Kernel-PCA
- On the influence of the kernel on the consistency of support vector machines
- On the mathematical foundations of learning
- Random Dot Product Graph Models for Social Networks
- Random matrix approximation of spectra of integral operators
- Spectral clustering and the high-dimensional stochastic blockmodel
- Statistical behavior and consistency of classification methods based on convex risk minimization.
- Statistical performance of support vector machines
- Support vector machines are universally consistent
- The Rotation of Eigenvectors by a Perturbation. III
- The phase transition in inhomogeneous random graphs
- Universal kernels
- Universality, Characteristic Kernels and RKHS Embedding of Measures
- Variation of discrete spectra
Cited In (29)
- Consistent nonparametric estimation for heavy-tailed sparse graphs
- A limit theorem for scaled eigenvectors of random dot product graphs
- Maximum a posteriori inference of random dot product graphs via conic programming
- Perfect clustering for stochastic blockmodel graphs via adjacency spectral embedding
- Entropic Optimal Transport on Random Graphs
- Exponential-family models of random graphs: inference in finite, super and infinite population scenarios
- Universal latent space model fitting for large networks with edge covariates
- Adaptive estimation of nonparametric geometric graphs
- Motif estimation via subgraph sampling: the fourth-moment phenomenon
- A probabilistic view of latent space graphs and phase transitions
- Shuffled graph classification: theory and connectome applications
- Markov random geometric graph, MRGG: a growth model for temporal dynamic networks
- Bayesian vertex nomination using content and context
- Vertex nomination
- Testing for Equivalence of Network Distribution Using Subgraph Counts
- Title not available
- Bayesian estimation of the latent dimension and communities in stochastic blockmodels
- Entrywise limit theorems for eigenvectors of signal-plus-noise matrix models with weak signals
- Community detection and percolation of information in a geometric setting
- High-dimensional Gaussian graphical models on network-linked data
- Statistical inference on random dot product graphs: a survey
- On the estimation of latent distances using graph distances
- Empirical Bayes estimation for the stochastic blockmodel
- Title not available
- On consistent vertex nomination schemes
- Efficient Estimation for Random Dot Product Graphs via a One-Step Procedure
- Title not available
- Hyperlink regression via Bregman divergence
- Stratified Stochastic Variational Inference for High-Dimensional Network Factor Model