The structure of low-complexity Gibbs measures on product spaces
DOI: 10.1214/19-AOP1352 · zbMATH Open: 1444.60006 · arXiv: 1810.07278 · MaRDI QID: Q2189463 · FDO: Q2189463
Publication date: 15 June 2020
Published in: The Annals of Probability
Full work available at URL: https://arxiv.org/abs/1810.07278
Recommendations
- Gaussian-width gradient complexity, reverse log-Sobolev inequalities and nonlinear large deviations
- Decomposition of mean-field Gibbs distributions into product measures
- Quantitative approximate independence for continuous mean field Gibbs measures
- Stein's method for discrete Gibbs measures
- A transportation approach to the mean-field approximation
Keywords: Gibbs measures; nonlinear large deviations; dual total correlation; gradient complexity; mixtures of product measures
Mathematics Subject Classification:
- Optimal transportation (49Q22)
- Probability measures on topological spaces (60B05)
- Measures of information, entropy (94A17)
- Lattice systems (Ising, dimer, Potts, etc.) and systems on graphs arising in equilibrium statistical mechanics (82B20)
- Stochastic processes (60G99)
Cites Work
- Elements of Information Theory
- Real Analysis and Probability
- Probability in Banach spaces. Isoperimetry and processes
- I-divergence geometry of probability distributions and minimization problems
- Measure concentration for a class of random processes
- Bounding \(\bar d\)-distance by informational divergence: A method to prove measure concentration
- A simple proof of the blowing-up lemma (Corresp.)
- Concentration of measure and isoperimetric inequalities in product spaces
- Transportation cost for Gaussian and other product measures
- Nonlinear large deviations
- On the variational problem for upper tails in sparse random graphs
- Nonnegative entropy measures of multivariate symmetric correlations
- Concentration of measure inequalities for Markov chains and \(\Phi\)-mixing processes
- Information inequalities and concentration of measure
- Large deviations for random graphs. École d'Été de Probabilités de Saint-Flour XLV -- 2015
- Linear dependence structure of the entropy space
- Decomposition of mean-field Gibbs distributions into product measures
- Measure concentration and the weak Pinsker property
- Gaussian-width gradient complexity, reverse log-Sobolev inequalities and nonlinear large deviations
- Upper tails and independence polynomials in random graphs
- Exponential random graphs behave like mixtures of stochastic block models
Cited In (14)
- Large deviations for the largest eigenvalue of Gaussian networks with constant average degree
- Upper tails via high moments and entropic stability
- Large deviations of subgraph counts for sparse Erdős-Rényi graphs
- Nonlinear large deviation bounds with applications to Wigner matrices and sparse Erdős-Rényi graphs
- Multi-variate correlation and mixtures of product measures
- Upper tail for homomorphism counts in constrained sparse random graphs
- A transportation approach to the mean-field approximation
- Taming correlations through entropy-efficient measure decompositions with applications to mean-field approximation
- Nonlinear large deviations: beyond the hypercube
- Spectral edge in sparse random graphs: upper and lower tail large deviations
- Upper tail of the spectral radius of sparse Erdös-Rényi graphs
- Upper Tail Large Deviations of Regular Subgraph Counts in Erdős‐Rényi Graphs in the Full Localized Regime
- Replica symmetry in upper tails of mean-field hypergraphs
- A Dimension-Free Reverse Logarithmic Sobolev Inequality for Low-Complexity Functions in Gaussian Space