Mean mutual information and symmetry breaking for finite random fields
Publication: 424692
DOI: 10.1214/11-AIHP416
zbMATH Open: 1259.94032
MaRDI QID: Q424692
FDO: Q424692
Authors: Jérôme Buzzi, Lorenzo Zambotti
Publication date: 4 June 2012
Published in: Annales de l'Institut Henri Poincaré. Probabilités et Statistiques
Full work available at URL: https://projecteuclid.org/euclid.aihp/1334148202
MSC classification:
- Measures of information, entropy (94A17)
- Neural networks for/in biological studies, artificial life and related topics (92B20)
- Combinatorial probability (60C05)
Cites Work
- Elements of Information Theory
- Title not available
- Random Fragmentation and Coagulation Processes
- Title not available
- Nonnegative entropy measures of multivariate symmetric correlations
- Analytical description of the evolution of neural networks: Learning rules and complexity
- Polymatroidal dependence structure of a set of random variables
- Information Inequalities for Joint Distributions, With Interpretations and Applications
- Approximate maximizers of intricacy functionals
Cited In (6)
- Measuring complexity through average symmetry
- Approximate maximizers of intricacy functionals
- Dynamical intricacy and average sample complexity of amenable group actions
- The pressure of intricacy and average sample complexity for amenable group actions
- Effects of parity, frustration, and stochastic fluctuations on integrated conceptual information for networks with two small-sized loops
- Dynamical intricacy and average sample complexity for random bundle transformations