Asymptotic mutual information for the balanced binary stochastic block model

Publication: 4603702

DOI: 10.1093/IMAIAI/IAW017
zbMATH Open: 1383.62021
arXiv: 1507.08685
OpenAlex: W2565889208
MaRDI QID: Q4603702
FDO: Q4603702


Authors: Yash Deshpande, Emmanuel Abbe, Andrea Montanari


Publication date: 19 February 2018

Published in: Information and Inference: A Journal of the IMA

Abstract: We develop an information-theoretic view of the stochastic block model, a popular statistical model for the large-scale structure of complex networks. A graph G from such a model is generated by first assigning vertex labels at random from a finite alphabet, and then connecting vertices with edge probabilities depending on the labels of the endpoints. In the case of the symmetric two-group model, we establish an explicit 'single-letter' characterization of the per-vertex mutual information between the vertex labels and the graph. The explicit expression of the mutual information is intimately related to estimation-theoretic quantities and, in particular, reveals a phase transition at the critical point for community detection. Below the critical point the per-vertex mutual information is asymptotically the same as if edges were independent. Correspondingly, no algorithm can estimate the partition better than random guessing. Conversely, above the threshold, the per-vertex mutual information is strictly smaller than the independent-edges upper bound. In this regime there exists a procedure that estimates the vertex labels better than random guessing.
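
To make the generative model described in the abstract concrete, below is a minimal Python sketch (not taken from the paper) of sampling the symmetric two-group model. The parameter names n, a, b are illustrative: within-group edges appear with probability a/n and across-group edges with probability b/n. The condition checked at the end is the well-known Kesten-Stigum threshold (a - b)^2 > 2(a + b) for detection in the sparse two-community model, quoted as standard background rather than as this paper's formula.

import numpy as np

def sample_two_group_sbm(n, a, b, seed=None):
    """Sample (labels, adjacency) from a balanced binary stochastic block model.

    Illustrative parameterization: within-group edge probability a/n,
    across-group edge probability b/n.
    """
    rng = np.random.default_rng(seed)
    labels = rng.choice([-1, 1], size=n)              # step 1: random +/-1 vertex labels
    same = np.equal.outer(labels, labels)             # True where endpoints share a label
    probs = np.where(same, a / n, b / n)              # step 2: label-dependent edge probabilities
    upper = np.triu(rng.random((n, n)) < probs, k=1)  # sample each unordered pair once
    adjacency = upper | upper.T                       # symmetric adjacency, no self-loops
    return labels, adjacency

if __name__ == "__main__":
    labels, A = sample_two_group_sbm(n=2000, a=8.0, b=2.0, seed=0)
    # Kesten-Stigum condition for the sparse regime (background, not this paper's
    # statement): detection beats random guessing iff (a - b)^2 > 2 (a + b).
    print(int(A.sum()) // 2, "edges; above threshold:", (8.0 - 2.0) ** 2 > 2 * (8.0 + 2.0))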


Full work available at URL: https://arxiv.org/abs/1507.08685




Cited in: 34 documents




