Asymptotic mutual information for the balanced binary stochastic block model
From MaRDI portal (Publication Q4603702)
Abstract: We develop an information-theoretic view of the stochastic block model, a popular statistical model for the large-scale structure of complex networks. A graph from such a model is generated by first assigning vertex labels at random from a finite alphabet, and then connecting vertices with edge probabilities depending on the labels of the endpoints. In the case of the symmetric two-group model, we establish an explicit 'single-letter' characterization of the per-vertex mutual information between the vertex labels and the graph. This explicit expression is intimately related to estimation-theoretic quantities and, in particular, reveals a phase transition at the critical point for community detection. Below the critical point, the per-vertex mutual information is asymptotically the same as if edges were independent; correspondingly, no algorithm can estimate the partition better than random guessing. Conversely, above the threshold, the per-vertex mutual information is strictly smaller than the independent-edges upper bound, and in this regime there exists a procedure that estimates the vertex labels better than random guessing.
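The generative model described in the abstract can be sketched in a few lines. The sketch below is a minimal illustration of the symmetric two-group case, not code from the paper; the sample size and the edge probabilities `p_in`, `p_out` are arbitrary choices for demonstration.

```python
import random

def sample_sbm(n, p_in, p_out, seed=0):
    """Sample a symmetric two-group stochastic block model.

    Each vertex independently receives a uniform label in {+1, -1};
    each pair of vertices is then connected independently with
    probability p_in if the labels agree and p_out otherwise.
    """
    rng = random.Random(seed)
    labels = [rng.choice((+1, -1)) for _ in range(n)]
    edges = []
    for i in range(n):
        for j in range(i + 1, n):
            p = p_in if labels[i] == labels[j] else p_out
            if rng.random() < p:
                edges.append((i, j))
    return labels, edges

labels, edges = sample_sbm(200, 0.10, 0.02)
```

In the sparse regime usually studied for this model one takes p_in = a/n and p_out = b/n with a > b fixed; the community-detection critical point mentioned in the abstract is then the well-known Kesten-Stigum threshold, where (a - b)^2 crosses 2(a + b).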
Cites work
- A generalization of the Lindeberg principle
- A nonparametric view of network models and Newman–Girvan and other modularities
- Algorithms for graph partitioning on the planted partition model
- An introduction to random matrices
- An iterative construction of solutions of the TAP equations for the Sherrington-Kirkpatrick model
- Applications of the Lindeberg Principle in Communications and Statistical Learning
- Broadcasting on trees and the Ising model
- Broken replica symmetry bounds in the mean field spin glass model
- Community detection thresholds and the weak Ramanujan property
- Conditional Random Fields, Planted Constraint Satisfaction and Entropy Concentration
- Estimation and prediction for stochastic blockmodels for graphs with latent block structure
- Exact Recovery in the Stochastic Block Model
- Exact solution of the gauge symmetric \(p\)-spin glass model on a complete graph
- Finding one community in a sparse graph
- Graph partitioning via adaptive spectral techniques
- Hill-climbing finds random planted bisections
- Mixed membership stochastic blockmodels
- Mutual Information and Minimum Mean-Square Error in Gaussian Channels
- Spectral clustering and the high-dimensional stochastic blockmodel
- State evolution for general approximate message passing algorithms, with applications to spatial coupling
- Stochastic blockmodels with a growing number of classes
- The Dynamics of Message Passing on Dense Graphs, with Applications to Compressed Sensing
- The Generalized Area Theorem and Some of its Consequences
- The Metropolis algorithm for graph bisection
- The eigenvalues and eigenvectors of finite, low rank perturbations of large random matrices
- The largest eigenvalues of finite rank deformation of large Wigner matrices: Convergence and nonuniversality of the fluctuations
- The solution of some random NP-hard problems in polynomial expected time
- Universality in polytope phase transitions and message passing algorithms
Cited in (34 documents)
- Fundamental limits of low-rank matrix estimation with diverging aspect ratios
- Mutual information for the sparse stochastic block model
- A Unifying Tutorial on Approximate Message Passing
- Spectral Clustering via Adaptive Layer Aggregation for Multi-Layer Networks
- Universality of approximate message passing algorithms
- Phase transition in the spiked random tensor with Rademacher prior
- Approximate message passing algorithms for rotationally invariant matrices
- TAP free energy, spin glasses and variational inference
- Weighted message passing and minimum energy flow for heterogeneous stochastic block models with side information
- Estimation of low-rank matrices via approximate message passing
- Universality of approximate message passing algorithms and tensor networks
- Phase transitions in semidefinite relaxations
- An information-percolation bound for spin synchronization on general graphs
- Statistical thresholds for tensor PCA
- Optimizing mean field spin glasses with external field
- Information theoretic limits of learning a sparse rule
- The adaptive interpolation method: a simple scheme to prove replica formulas in Bayesian inference
- Fundamental limits of detection in the spiked Wigner model
- Fundamental limits of symmetric low-rank matrix estimation
- Partial recovery bounds for clustering with the relaxed \(K\)-means
- Community detection and stochastic block models: recent developments
- Estimating rank-one matrices with mismatched prior and noise: universality and large deviations
- Application of the information-percolation method to reconstruction problems on graphs
- Notes on computational hardness of hypothesis testing: predictions using the low-degree likelihood ratio
- On the computational tractability of statistical estimation on amenable graphs
- A Friendly Tutorial on Mean-Field Spin Glass Techniques for Non-Physicists
- Local convexity of the TAP free energy and AMP convergence for \(\mathbb{Z}_2\)-synchronization
- Phase transition in random tensors with multiple independent spikes
- Notes on computational-to-statistical gaps: predictions using statistical physics
- Statistical limits of spiked tensor models
- Optimality and sub-optimality of PCA. I: Spiked random matrix models
- Computational barriers to estimation from low-degree polynomials
- Global and local information in clustering labeled block models
- Entrywise eigenvector analysis of random matrices with low expected rank