Stein's method, logarithmic Sobolev and transport inequalities (Q2339746)
Language | Label | Description | Also known as
---|---|---|---
English | Stein's method, logarithmic Sobolev and transport inequalities | scientific article |
Statements
Stein's method, logarithmic Sobolev and transport inequalities (English)
2 April 2015
Let \(\nu\) be a centred probability measure on \(\mathbb{R}^d\) which has a smooth density with respect to \(\gamma\), the standard \(d\)-dimensional Gaussian measure. The classical logarithmic Sobolev inequality states that \(H(\nu|\gamma)\leq\frac{1}{2}I(\nu|\gamma)\), where \(H(\cdot|\cdot)\) is the relative entropy and \(I(\cdot|\cdot)\) is the Fisher information. One of the main results of the present paper is a strengthening of this inequality, using tools from, and connections with, Stein's method for probability approximation. The essential objects in the statements of the paper's results are the Stein kernel and the Stein discrepancy. The Stein kernel \(\tau_\nu\) is a measurable matrix-valued map on \(\mathbb{R}^d\) defined by a certain integration-by-parts formula (a standard formulation is sketched below). The Stein discrepancy \(S(\nu|\gamma)\) measures the proximity of \(\tau_\nu\) to the identity, which in turn acts as a measure of closeness of \(\nu\) to \(\gamma\). The authors' improved log-Sobolev inequality states that \[ H(\nu|\gamma)\leq\frac{1}{2}S^2(\nu|\gamma)\log\left(1+\frac{I(\nu|\gamma)}{S^2(\nu|\gamma)}\right)\,, \] which is proved by modifying the usual control of the Fisher information along the Ornstein-Uhlenbeck semigroup.

A second main result of the paper improves upon Talagrand's quadratic transportation cost inequality \(W_2^2(\nu,\gamma)\leq 2H(\nu|\gamma)\), where \(W_2(\nu,\gamma)\) denotes the Wasserstein distance (of order 2) between \(\nu\) and \(\gamma\). The authors prove that \[ W_2(\nu,\gamma)\leq S(\nu|\gamma)\arccos\left(e^{-\frac{H(\nu|\gamma)}{S^2(\nu|\gamma)}}\right)\,. \]

Such inequalities are shown to have applications to exponential convergence to equilibrium, concentration inequalities, and rates of convergence in entropic central limit theorems. The authors also prove analogous inequalities in a more general setting, for reference measures other than the Gaussian; in particular, they consider multivariate gamma distributions and families of log-concave distributions. In the final part of the present work, the authors prove entropic bounds on multidimensional functions \(F\) using data on \(F\) and its gradients, thus bypassing the finiteness condition on the Fisher information used in the first part of their work.
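For orientation, the Stein kernel and Stein discrepancy admit the following standard formulation in the Stein's method literature (a sketch only; the paper's precise regularity assumptions are not restated here). The Stein kernel \(\tau_\nu\) of a centred measure \(\nu\) is a matrix-valued map satisfying, for all smooth test functions \(\varphi:\mathbb{R}^d\to\mathbb{R}\), \[ \int_{\mathbb{R}^d}x\,\varphi\,d\nu=\int_{\mathbb{R}^d}\tau_\nu\,\nabla\varphi\,d\nu\,, \] and the Stein discrepancy is the \(L^2(\nu)\)-distance of \(\tau_\nu\) to the identity, \[ S(\nu|\gamma)=\left(\int_{\mathbb{R}^d}\|\tau_\nu-\mathrm{Id}\|_{\mathrm{HS}}^2\,d\nu\right)^{1/2}\,. \] Since \(\tau_\gamma=\mathrm{Id}\), a small discrepancy indicates closeness to the Gaussian. Note also that, because \(\log(1+x)\leq x\), the improved log-Sobolev inequality displayed above always recovers the classical bound \(H(\nu|\gamma)\leq\frac{1}{2}I(\nu|\gamma)\).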
entropy
Fisher information
Stein kernel
Stein discrepancy
logarithmic Sobolev inequality
transport inequality
convergence to equilibrium
concentration inequality
normal approximation
\(\Gamma\)-calculus