High-temperature expansions and message passing algorithms
DOI: 10.1088/1742-5468/ab4bbb · zbMATH Open: 1459.82094 · arXiv: 1906.08479 · OpenAlex: W3098725392 · MaRDI QID: Q5854076
Authors: Antoine Maillard, Laura Foini, Alejandro Lage-Castellanos, Florent Krzakala, Marc Mézard, Lenka Zdeborová
Publication date: 16 March 2021
Published in: Journal of Statistical Mechanics: Theory and Experiment
Abstract: Improved mean-field techniques are a central theme of statistical physics methods applied to inference and learning. We revisit here some of these methods using high-temperature expansions for disordered systems initiated by Plefka, Georges and Yedidia. We derive the Gibbs free entropy and the subsequent self-consistent equations for a generic class of statistical models with correlated matrices and show in particular that many classical approximation schemes, such as adaptive TAP, Expectation Consistency, or the approximations behind the Vector Approximate Message Passing algorithm, all rely on the same assumptions, which are also at the heart of high-temperature expansions. We focus on the case of rotationally invariant random coupling matrices in the `high-dimensional' limit in which the number of samples and the dimension are both large, but with a fixed ratio. This encapsulates many widely studied models, such as Restricted Boltzmann Machines or Generalized Linear Models with correlated data matrices. In this general setting, we show that all the approximation schemes described before are equivalent, and we conjecture that they are exact in the thermodynamic limit in the replica symmetric phases. We reach this conclusion by resumming the infinite perturbation series, which generalizes a seminal result of Parisi and Potters. A rigorous derivation of this conjecture is an interesting mathematical challenge. On the way to these conclusions, we uncover several diagrammatic results in connection with free probability and random matrix theory that are interesting independently of the rest of our work.
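As a brief illustration of the expansion at the heart of these methods (a classical textbook computation, not a result specific to this paper): the Plefka-Georges-Yedidia approach expands the Gibbs free entropy at fixed magnetizations \(m_i\) in powers of the inverse temperature \(\beta\). For Ising spins with pairwise couplings \(J_{ij}\), truncating at second order yields the TAP free entropy

\[
\Phi(\mathbf{m}) = -\sum_i \left[ \frac{1+m_i}{2}\ln\frac{1+m_i}{2} + \frac{1-m_i}{2}\ln\frac{1-m_i}{2} \right] + \beta \sum_{i<j} J_{ij}\, m_i m_j + \frac{\beta^2}{2} \sum_{i<j} J_{ij}^2 \,(1-m_i^2)(1-m_j^2) + O(\beta^3),
\]

whose stationarity conditions \(\partial\Phi/\partial m_i = 0\) are the TAP self-consistent equations. For the rotationally invariant coupling ensembles studied in the paper, the series does not truncate at second order; resumming it replaces the Onsager correction by a functional of the coupling spectrum (its R-transform), in the spirit of the Parisi-Potters result that the paper generalizes.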
Full work available at URL: https://arxiv.org/abs/1906.08479
Recommendations
- Mean-field inference methods for neural networks
- A Unifying Tutorial on Approximate Message Passing
- Approximate inference in Boltzmann machines
- Mean field asymptotics in high-dimensional statistics: from exact results to efficient algorithms
- A dynamical mean-field theory for learning in restricted Boltzmann machines
Cites Work
- Graphical models, exponential families, and variational inference
- Title not available
- Distribution of eigenvalues for some sets of random matrices
- Compressed sensing
- An introduction to random matrices
- Title not available
- Neural networks and physical systems with emergent collective computational abilities
- Title not available
- The planar approximation. II
- Differential Operators on a Semisimple Lie Algebra
- Replica field theory for deterministic models: I. Binary sequences with low autocorrelation
- Replica field theory for deterministic models. II. A non-random spin glass with glassy behaviour
- A Fourier view on the \(R\)-transform and related asymptotics of spherical integrals
- New scaling of Itzykson-Zuber integrals
- Integration with respect to the Haar measure on unitary, orthogonal and symplectic group
- Information, Physics, and Computation
- Random Matrix Theory and Wireless Communications
- Expectation consistent approximate inference
- Advanced mean field methods. Theory and practice
- Mean-field theory for a spin-glass model of neural networks: TAP free energy and the paramagnetic to spin-glass transition
- Mean-field equations for spin models with orthogonal interaction matrices
- A theory of solving TAP equations for Ising models with general invariant random matrices
- Optimal errors and phase transitions in high-dimensional generalized linear models
- Vector Approximate Message Passing
- The jamming transition in high dimension: an analytical study of the TAP equations and the effective thermodynamic potential
- Perceptron capacity revisited: classification ability for correlated patterns
Cited In (9)
- Perturbative construction of mean-field equations in extensive-rank matrix factorization and denoising
- Approximate message passing algorithms for rotationally invariant matrices
- Structured random matrices and cyclic cumulants: a free probability approach
- Mean-field inference methods for neural networks
- The replica-symmetric free energy for Ising spin glasses with orthogonally invariant couplings
- Solving the spherical \(p\)-spin model with the cavity method: equivalence with the replica results
- Matrix denoising: Bayes-optimal estimators via low-degree polynomials
- A dynamical mean-field theory for learning in restricted Boltzmann machines
- Marginals of a spherical spin glass model with correlated disorder