Entropy and mutual information in models of deep neural networks*
DOI: 10.1088/1742-5468/ab3430
zbMath: 1459.94076
arXiv: 1805.09785
OpenAlex: W2803439868
MaRDI QID: Q5854116
André Manoel, Jean Barbier, Clément Luneau, Florent Krzakala, Marylou Gabrié, Lenka Zdeborová, Nicolas Macris
Publication date: 16 March 2021
Published in: Journal of Statistical Mechanics: Theory and Experiment
Full work available at URL: https://arxiv.org/abs/1805.09785
MSC classification:
- Artificial neural networks and deep learning (68T07)
- Neural nets applied to problems in time-dependent statistical mechanics (82C32)
- Measures of information, entropy (94A17)
- Neural nets and related approaches to inference from stochastic processes (62M45)
Related Items (6)
Uses Software
Cites Work
- High dimensional robust M-estimation: asymptotic variance via approximate message passing
- The adaptive interpolation method: a simple scheme to prove replica formulas in Bayesian inference
- Statistical Mechanics of Learning
- Information, Physics, and Computation
- Deep Neural Networks with Random Gaussian Weights: A Universal Classification Strategy?
- The adaptive interpolation method for proving replica formulas. Applications to the Curie–Weiss and Wigner spike models
- Optimal errors and phase transitions in high-dimensional generalized linear models