Entropy of some models of sparse random graphs with vertex-names
From MaRDI portal
Publication:5416367
Abstract: Consider the setting of sparse graphs on N vertices, where the vertices have distinct "names", which are strings of length O(log N) from a fixed finite alphabet. For many natural probability models, the entropy grows as cN log N for some model-dependent rate constant c. The mathematical content of this paper is the (often easy) calculation of c for a variety of models, in particular for various standard random graph models adapted to this setting. Our broader purpose is to publicize this particular setting as a natural setting for future theoretical study of data compression for graphs, and (more speculatively) for discussion of unorganized versus organized complexity.
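The \(cN\log N\) scaling can be illustrated with a toy calculation (a hypothetical sketch, not taken from the paper): if the \(N\) vertex names are simply the binary strings for \(0,\dots,N-1\) (length about \(\log_2 N\)), assigned to vertices by a uniform random permutation, the entropy of the naming alone is \(\log_2 N! \approx N\log_2 N\) bits, i.e. rate constant \(c = 1\) in bits.

```python
import math

def naming_entropy_bits(n: int) -> float:
    """Entropy in bits of a uniformly random assignment of n distinct
    names to n vertices: log2(n!), computed via lgamma for large n."""
    return math.lgamma(n + 1) / math.log(2)

# The ratio H / (N log2 N) approaches 1 slowly (the gap is ~ log2(e)/log2(N),
# by Stirling's approximation), consistent with c*N*log N growth with c = 1.
for n in (10**3, 10**6, 10**9):
    ratio = naming_entropy_bits(n) / (n * math.log2(n))
    print(f"N = {n:>10}  H/(N log2 N) = {ratio:.4f}")
```

The graph models in the paper add further terms to the entropy, but in the sparse regime the naming contribution of order \(N\log N\) dominates, which is why the rate constant \(c\) is the natural quantity to compute.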
Recommendations
Cites work
- scientific article (zbMATH DE number 2042286; no title available)
- A history of graph entropy measures
- Complex graphs and networks
- Compression of Graphical Structures: Fundamental Limits, Algorithms, and Experiments
- Elements of Information Theory
- Inequalities with applications to percolation and reliability
- Pattern matching and lossy data compression on random fields
- Probability Estimation in the Rare-Events Regime
- Processes on unimodular random networks
- The \(t\)-improper chromatic number of random graphs
- The continuum random tree. I
- Universal Compression of Memoryless Sources Over Unknown Alphabets
Cited in (2)